Philosophy of Information Bibliography...

Bawden, D. and Robinson, L. "Deep down things": in what ways is information physical, and why does it matter for information science? 2013 Information Research: An International Electronic Journal
Vol. 18(3) 
article URL 
Abstract: Introduction. Rolf Landauer declared in 1991 that 'information is physical'. Since then, information has come to be seen by many physicists as a fundamental component of the physical world; indeed by some as the physical component. This idea is now gaining currency in popular science communication. However, it is often far from clear what exactly this statement means; exactly how is information physical? And why should this matter for information science? The purpose of this paper is to clarify just what is meant by the physical nature of information, and the significance of these considerations for our discipline. Methods. A selective literature review and conceptual analysis, based on literature from both physical science and information science. Results. The prospect of attempting to make links between objective and subjective conceptions of information has been strongly advocated by some authors and doubted by others. The physical nature of information can be understood from three main perspectives: the relation between information and physical entropy; the strongly informational nature of the quantum view of nature; and the possibility of recasting physical laws in informational terms. Conclusions. Based on this analysis, we muse on the relevance of such issues to information science, with particular reference to emergent properties of information. Apart from the added public awareness of the i-word in a very different context from the norm, it may be that there are general laws and principles, or at least useful metaphors and analogies, linking the concept of information in the physical, biological and social domains.
BibTeX:
@article{bawden_deep_2013,
  author = {Bawden, D. and Robinson, L.},
  title = {"Deep down things": in what ways is information physical, and why does it matter for information science?},
  journal = {Information Research: An International Electronic Journal},
  year = {2013},
  volume = {18},
  number = {3},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1LS8NAEF4EL4KIT6xaGIr2srbUvGwFkdJGLNQqNp5LHhuaQ0PBQOi_dya7adLYQ89CTsvusul-nf1mMvMtY7rW7rQqNuHRMANdILU3jdA03V7gIs3QPSMUhq6FoV6JbOcbX7T9h42_07ShEEseoH-NtJIi4dhEnn8U83TuJjx1Vz90kbkSTc0QsFT7lWdzpvMVzkDZWglfZBqcWUJieYg6PCupgaMJ-pXv8jKf_GtUqz_hm9q79tgeOF8fE8SASj8qQqppbgqLj0FFjdq4HKagKyN65TBF5s1O35B8j-zxkA_tT4eXF1Tmv9IS6xa6t5YsVG2LLW1_zPfau99Q1a6cduscRPIGkes8kBXDU7pJUuuLIPKTZxG3vqd4qlP1ZE6wlQvfMTLp5vVCttKVjJo4x-xI-RTQl1g4YXsiPmV1VZECTRgVmwbKlp-xqEEoAUIJSJQ0niCKgRAChBCI8CmNzBFyD4gP7LYCwgdECUh8AHbdGKBW-3LO-KvtIArK7zBbSpmTWeUH0i7YoUtVF3GSVWcGlwx8reN3kWV1A93HExe5n-X1LA85qOe6Qtdq7HaXqWvsYZduA6VaT2oNydVuU1-zgwKNN2w_xP-2qEvNzV_ZC3di}
}
Shannon, C.E. "Information Theory" from Encyclopedia Britannica 14th edition 1968 1993 Claude Elwood Shannon: collected papers (Sloane, N.J.A. and Wyner, A.D., eds.), pp. 212-220  incollection
BibTeX:
@incollection{shannon_information_1993,
  author = {Shannon, Claude E.},
  editor = {Sloane, N. J. A. and Wyner, A. D.},
  title = {"Information Theory" from Encyclopedia Britannica 14th edition 1968},
  booktitle = {Claude Elwood Shannon: collected papers},
  publisher = {IEEE Press},
  year = {1993},
  pages = {212--220}
}
Goldman, A. A Causal Theory of Knowing 1967 The Journal of Philosophy
Vol. 64, pp. 357-372 
article  
BibTeX:
@article{Goldman1967,
  author = {Goldman, A.},
  title = {A Causal Theory of Knowing},
  journal = {The Journal of Philosophy},
  year = {1967},
  volume = {64},
  pages = {357--372}
}
Adriaans, P. A Critical Analysis of Floridi's Theory of Semantic Information 2010 Knowledge, Technology & Policy
Vol. 23(1-2), pp. 41-56 
article  
Abstract: In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science that all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises what the exact relation between these various conceptions of information is and whether there is a real need to enrich these mathematically more or less rigid definitions with a less formal notion of semantic information. I investigate various philosophical aspects of the more formal definitions of information in the light of Floridi's theory. The position I defend is that the formal treatment of the notion of information as a general theory of entropy is one of the fundamental achievements of modern science that in itself is a rich source for new philosophical reflection. This makes information theory a competitor of classical epistemology rather than a servant. In this light Floridi's philosophy of information is more a reprise of classical epistemology that only pays lip service to information theory but fails to address the important central questions of philosophy of information. Specifically, I will defend the view that notions that are associated with truth, knowledge, and meaning all can adequately be reconstructed in the context of modern information theory and that consequently there is no need to introduce a concept of semantic information.
BibTeX:
@article{Adriaans2010,
  author = {Adriaans, P.},
  title = {A Critical Analysis of Floridi's Theory of Semantic Information},
  journal = {Knowledge, Technology \& Policy},
  year = {2010},
  volume = {23},
  number = {1-2},
  pages = {41--56}
}
Floridi, L. A Defence of Informational Structural Realism 2008 Synthese
Vol. 161(2), pp. 219-253 
article URL 
Abstract: This is the revised version of an invited keynote lecture delivered at the 1st Australian Computing and Philosophy Conference (CAP@AU; the Australian National University in Canberra, 31 October-2 November, 2003). The paper is divided into two parts. The first part defends an informational approach to structural realism. It does so in three steps. First, it is shown that, within the debate about structural realism (SR), epistemic (ESR) and ontic (OSR) structural realism are reconcilable. It follows that a version of OSR is defensible from a structuralist-friendly position. Second, it is argued that a version of OSR is also plausible, because not all relata (structured entities) are logically prior to relations (structures). Third, it is shown that a version of OSR is also applicable to both sub-observable (unobservable and instrumentally-only observable) and observable entities, by developing its ontology of structural objects in terms of informational objects. The outcome is informational structural realism, a version of OSR supporting the ontological commitment to a view of the world as the totality of informational objects dynamically interacting with each other. The paper has been discussed by several colleagues and, in the second half, ten objections that have been moved to the proposal are answered in order to clarify it further.
BibTeX:
@article{Floridi2008,
  author = {Floridi, L.},
  title = {A Defence of Informational Structural Realism},
  journal = {Synthese},
  year = {2008},
  volume = {161},
  number = {2},
  pages = {219--253},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3db9MwED8hkNAkBHSwLHxIeQAEiJQ4ThznCU0d0yR4QHxIvFmOP5660dEgwf567uwk7caQ4KlNZbeuffb9znf3OwBezov80png6s4bzQz3tfNee25bg8iitIVzrAmMG1s329NNBgVZhijB4NNHuNQt3euyETVHo-7N6iyn6lHkZR1KaeBRTBRlsZT55ExAi2BgZ2zzRtbN6NwMGXSsLNuc7usQJ_H8_IJ6Gg7pGKh4EYJSzsiP9Zb79ErFFZTU0R0Yo-LG4JTJY73Jqb8iePs___xduD2A2OwgSt0MrrnTXdj5MFZF-LULs-HMWGfPB2LrF_cgP8gOnaePs28-GxKh4l1k9inw2BIHSPbRESnjyX34cvT28-I4H8o15KYsZIX7rcZXrWllheXaOqu5bhGvadt0Ai0XWxdCa9zzjFmEfaJkxrTaaqmN4IbvwS1NYf2nfUj_swnc8LgHXUJ6McHJTODm1_b9oTx-t4iPs_Fxvg45avOzPsFVD9ORi3mzDxk3Eg9lL0QrdOUl_n6j69q0HZNedFWXwstxwdUqEnyoDZUzSYeityQd6jyFhERC0ebvv2ujGBU0RNBbpbAX1mr6jnGhsMsoNsoul2hqlWjPIKpjKTyNUrTpo9alKhSXTdUitpCSq_5nn8L-pXbk_G0QVogUXo3ytxl8GDNdQatBDOLgV9an8OyP5qEMaeyD4B-HQI1TeLItzFNj0p91KYmVkIyBFNi_NFsMbPPEstA_-NtEPYSdGItD8X2P4DqKnXscyTF_AyYQRU4}
}
Bruers, S. A discussion on maximum entropy production and information theory 2007 Journal of Physics A: Mathematical and Theoretical
Vol. 40, pp. 7441-7450 
article  
BibTeX:
@article{Bruers2007,
  author = {Bruers, S.},
  title = {A discussion on maximum entropy production and information theory},
  journal = {Journal of Physics A: Mathematical and Theoretical},
  year = {2007},
  volume = {40},
  pages = {7441--7450}
}
Floridi, L. A Distributed Model of Truth for Semantic Information 2009   inproceedings URL 
BibTeX:
@inproceedings{floridi_distributed_2009,
  author = {Floridi, Luciano},
  title = {A Distributed Model of Truth for Semantic Information},
  year = {2009},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1LS8NAEB6UgghCLdaoVVi8J-a9zalIayjoQWgP3kr2hYJa28b_70ySTdFCDx6XZeHbnWG-YR47AFHo-e4fmyBQkVQskgwJ3ucmyAolqQAevfUoSLj6Hdlu42-NtK2RrCy3WkoKmt9RKhO5LRt9rVwaIkXJ1maixiF0Ao6qjOrNZ_mOla2oI–CLRuwJSNtHnnb6b5bUv1PSKfQ37bwseeWnHpwoD_PILlnE_ool2ZcacVoDto7Wxo2X3-XrwzdVzbTH_jUb5I1bUoktj7c5g_z8dS1aBaFoACJLDcLiyU8h5OCiuXpMLqvyoGOQc3WDrGNg4_hwNFL9jQZTh_H9bJnl96m6vzyVqWD5Fbdw009fgEMRSdUECXcCBObSIlUFjrRfmiEFEOZXsJgD6arvbsDOK5TNqkbxNcN2pv6m8QfOve1iA}
}
Seife, C. A general surrenders the field, but black hole battle rages on: Stephen Hawking may have changed his mind, but questions about the fate of information continue to expose fault lines between relativity and quantum theories 2004 Science
Vol. 305(5686), p. 934 
article  
BibTeX:
@article{siefe_general_2004,
  author = {Seife, Charles},
  title = {A general surrenders the field, but black hole battle rages on: Stephen Hawking may have changed his mind, but questions about the fate of information continue to expose fault lines between relativity and quantum theories},
  journal = {Science},
  year = {2004},
  volume = {305},
  number = {5686},
  pages = {934}
}
Breuer, T. A Gödel-Turing perspective on quantum states indistinguishable from inside 2012   incollection  
BibTeX:
@incollection{breuer_go-turing_2012,
  author = {Breuer, Thomas},
  title = {A Gödel-Turing perspective on quantum states indistinguishable from inside},
  year = {2012}
}
Cross, C.B. A Logical Transmission Principle for Conclusive Reasons 2015 Australasian Journal of Philosophy
Vol. 93(2), pp. 353-370 
article  
Abstract: Dretske's conclusive reasons account of knowledge is designed to explain how epistemic closure can fail when the evidence for a belief does not transmit to some of that belief's logical consequences. Critics of Dretske dispute the argument against closure while joining Dretske in writing off transmission. This paper shows that, in the most widely accepted system for counterfactual logic (David Lewis's system VC), conclusive reasons are governed by an informative, non-trivial, logical transmission principle. If r is a conclusive reason for believing p in Dretske's sense, and if p logically implies q, and if p and q satisfy one additional condition, it follows that r is a conclusive reason for believing q. After introducing this additional condition, I explain its intuitive import and use the condition to shed new light on Dretske's response to scepticism, as well as on his distinction between the so-called 'lightweight' and 'heavyweight' implications of a piece of perceptual knowledge.
BibTeX:
@article{cross_logical_2015,
  author = {Cross, Charles B.},
  title = {A Logical Transmission Principle for Conclusive Reasons},
  journal = {Australasian Journal of Philosophy},
  year = {2015},
  volume = {93},
  number = {2},
  pages = {353--370}
}
Shannon, C.E. A Mathematical Theory of Communication 2001 SIGMOBILE Mob. Comput. Commun. Rev.
Vol. 5(1), pp. 3-55 
article DOI URL 
BibTeX:
@article{shannon_mathematical_2001,
  author = {Shannon, C. E.},
  title = {A Mathematical Theory of Communication},
  journal = {SIGMOBILE Mob. Comput. Commun. Rev.},
  year = {2001},
  volume = {5},
  number = {1},
  pages = {3--55},
  url = {http://doi.acm.org.ezproxy1.library.usyd.edu.au/10.1145/584091.584093},
  doi = {10.1145/584091.584093}
}
Shannon, C.E. A Mathematical Theory of Communication 1948 Bell Systems Laboratories  techreport URL 
BibTeX:
@techreport{Shannon1948,
  author = {Shannon, C. E.},
  title = {A Mathematical Theory of Communication},
  institution = {Bell Systems Laboratories},
  year = {1948},
  url = {http://cm.bell-labs.com/cS/ms/what/shannonday/paper.html}
}
Klüver, J. A mathematical theory of communication: Meaning, information, and topology 2011 Complexity
Vol. 16(3), pp. 10-26 
article  
Abstract: This article proposes a new mathematical theory of communication. The basic concepts of meaning and information are defined in terms of complex systems theory. Meaning of a message is defined as the attractor it generates in the receiving system; information is defined as the difference between a vector of expectation and one of perception. It can be shown that both concepts are determined by the topology of the receiving system. © 2010 Wiley Periodicals, Inc. Complexity 16: 10–26, 2011
BibTeX:
@article{kluver_mathematical_2011,
  author = {Klüver, Jürgen},
  title = {A mathematical theory of communication: Meaning, information, and topology},
  journal = {Complexity},
  year = {2011},
  volume = {16},
  number = {3},
  pages = {10--26}
}
Shannon, Claude E. A Mathematical Theory of Communication: Reprinted with corrections (50th anniversary release of the 1948 paper) 1998 The Bell System Technical Journal  article URL 
BibTeX:
@article{shannon_claude_e._mathematical_1998,
  author = {Shannon, Claude E.},
  title = {A Mathematical Theory of Communication: Reprinted with corrections (50th anniversary release of the 1948 paper)},
  journal = {The Bell System Technical Journal},
  year = {1998},
  url = {http://cm.bell-labs.com/cS/ms/what/shannonday/paper.html}
}
Fredkin, E. A New Cosmogony 1992 Physics and Computation, 1992. PhysComp '92., Workshop on, pp. 116-121  inproceedings DOI  
BibTeX:
@inproceedings{fredkin_new_1992,
  author = {Fredkin, E.},
  title = {A New Cosmogony},
  booktitle = {Physics and Computation, 1992. PhysComp '92., Workshop on},
  year = {1992},
  pages = {116--121},
  doi = {10.1109/PHYCMP.1992.615507}
}
Sequoiah-Grayson, S. A Positive Information Logic for Inferential Information 2009 Synthese
Vol. 167(2), pp. 409-431 
article URL 
Abstract: Performing an inference involves irreducibly dynamic cognitive procedures. The article proposes that a non-associative information frame, corresponding to a residuated pogroupoid, underpins the information structure involved. The argument proceeds by expounding the informational turn in logic, before outlining the cognitive actions at work in deductive inference. The structural rules of Weakening, Contraction, Commutation, and Association are rejected on the grounds that they cause us to lose track of the information flow in inferential procedures. By taking the operation of information application as the primary operation, the fusion connective is retained, with commutative failure generating a double implication. The other connectives are rejected.
BibTeX:
@article{sequoiah-grayson_positive_2009,
  author = {Sequoiah-Grayson, Sebastian},
  title = {A Positive Information Logic for Inferential Information},
  journal = {Synthese},
  year = {2009},
  volume = {167},
  number = {2},
  pages = {409--431},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9QwEB4hkFAlVNhC0_CQcgAEiGxj52H7hKqFqhIcegCJm-XYzmlVbbspgn_PjONkd0uR4OjN2PLaX-aReQGUfF7kN3hCw4yVFbOda5VxTLYV42Qe1dJU3hV298v29CWDgixDlGDw6aO61C79MVo8gqGt_GF1mVP3KPKyxlYayIqpRNnQynxyJqBFEKszqlzIWozOzZBBxzhXOcUAKBRtudgRT5FJD4GKuyoo5Yxcr7fcp7cKriCkTh_CGBU3BqdMHutNTv0twdv_-ecfwX5UYrOTAXUzuOMvDmDvfOyK8OsAZpFnrLM3sbD128cgT7LzECP2w2cxDYpgkVG_Z5vhmH6l9EPkOsttiifw7fTT18VZHjs35JajzpJ3RelcVzOEhzdN65lxrSgcbt0qVZVt1VpfNvhICWvKivNCdoVFVFBKRCdMeQgPDEX4X_QhE9AlcK_D19EnJCITPNcE7n9XXz7Ks8-LYTgbh_N1SFebX_YJAiCcTN7MxRFkLbGwzjvv6q7yvjKoX3ZKlkb6xnpWp_BuvHu9Gmp96E1VZwKKpp6dBBQtUkgIHZr4QH9lrKaIeioTpVI4DNc2rTHeGU4ZEaTdcqkRl1T_CbWsFF4NgJrmcL3mutANlQ9CDQ3NWN3_7HGFG3Q1uciYwBXej0jc7D1sOfQZjYAY9r5yXQqv_yAnP1mcwxqBOyDiFF5uw3oiJkna4M1JHqyKFNi_kC1i3Xmqt9A__ds5PYO9wWdHkX7P4W5_de1fDGUyfwMupkmO}
}
Taddeo, M. and Floridi, L. A Praxical Solution of the Symbol Grounding Problem 2007 Minds and Machines
Vol. 17(4), pp. 369-389 
article  
Abstract: This article is the second step in our research into the Symbol Grounding Problem (SGP). In a previous work, we defined the main condition that must be satisfied by any strategy in order to provide a valid solution to the SGP, namely the zero semantic commitment condition (Z condition). We then showed that all the main strategies proposed so far fail to satisfy the Z condition, although they provide several important lessons to be followed by any new proposal. Here, we develop a new solution of the SGP. It is called praxical in order to stress the key role played by the interactions between the agents and their environment. It is based on a new theory of meaning—Action-based Semantics (AbS)—and on a new kind of artificial agents, called two-machine artificial agents (AM²). Thanks to their architecture, AM²s implement AbS, and this allows them to ground their symbols semantically and to develop some fairly advanced semantic abilities, including the development of semantically grounded communication and the elaboration of representations, while still respecting the Z condition.
BibTeX:
@article{taddeo_praxical_2007,
  author = {Taddeo, Mariarosaria and Floridi, Luciano},
  title = {A Praxical Solution of the Symbol Grounding Problem},
  journal = {Minds and Machines},
  year = {2007},
  volume = {17},
  number = {4},
  pages = {369--389}
}
Bennett, C.H. A Resource-based View of Quantum Information 2004 Quantum Info. Comput.
Vol. 4(6), pp. 460-466 
article URL 
BibTeX:
@article{bennett_resource-based_2004,
  author = {Bennett, Charles H.},
  title = {A Resource-based View of Quantum Information},
  journal = {Quantum Info. Comput.},
  year = {2004},
  volume = {4},
  number = {6},
  pages = {460--466},
  url = {http://dl.acm.org/citation.cfm?id=2011593.2011598}
}
Ubriaco, M.R. A simple mathematical model for anomalous diffusion via Fisher's information theory 2009 Physics Letters A
Vol. 373(44), pp. 4017-4021 
article  
Abstract: Starting with the relative entropy based on a previously proposed entropy function S_q[p] = ∫ dx p(x)(−ln p(x))^q, we find the corresponding Fisher's information measure. After function redefinition we then maximize the Fisher information measure with respect to the new function and obtain a differential operator that reduces to a space coordinate second derivative in the q → 1 limit. We then propose a simple differential equation for anomalous diffusion and show that its solutions are a generalization of the functions in the Barenblatt-Pattle solution. We find that the mean squared displacement, up to a q-dependent constant, has a time dependence according to ⟨x²⟩ ∼ K^(1/q) t^(1/q), where the parameter q takes values q = (2n−1)/(2n+1) (superdiffusion) and q = (2n+1)/(2n−1) (subdiffusion), for all n ≥ 1. © 2009 Elsevier B.V. All rights reserved.
BibTeX:
@article{ubriaco_simple_2009,
  author = {Ubriaco, Marcelo R.},
  title = {A simple mathematical model for anomalous diffusion via Fisher's information theory},
  journal = {Physics Letters A},
  year = {2009},
  volume = {373},
  number = {44},
  pages = {4017--4021}
}
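As a quick numeric companion to the abstract above (my illustration, not code from the paper): the mean-squared-displacement scaling ⟨x²⟩ ∼ K^(1/q) t^(1/q) implies a growth exponent 1/q, with q = (2n−1)/(2n+1) giving superdiffusion (exponent > 1) and q = (2n+1)/(2n−1) giving subdiffusion (exponent < 1). The function name `diffusion_exponent` is mine.

```python
# Sketch of the anomalous-diffusion exponents 1/q from the abstract's
# scaling <x^2> ~ K^(1/q) t^(1/q); illustrative only, not from the paper.

from fractions import Fraction

def diffusion_exponent(n, kind):
    """Return 1/q for q = (2n-1)/(2n+1) (superdiffusion) or
    q = (2n+1)/(2n-1) (subdiffusion), for integer n >= 1."""
    if n < 1:
        raise ValueError("the abstract requires n >= 1")
    if kind == "super":
        q = Fraction(2 * n - 1, 2 * n + 1)   # q < 1, so exponent 1/q > 1
    elif kind == "sub":
        q = Fraction(2 * n + 1, 2 * n - 1)   # q > 1, so exponent 1/q < 1
    else:
        raise ValueError("kind must be 'super' or 'sub'")
    return 1 / q  # MSD grows like t**(1/q); 1/q == 1 is normal diffusion

# As n grows, both families approach q = 1, i.e. ordinary diffusion.
print(diffusion_exponent(1, "super"))  # prints 3
print(diffusion_exponent(1, "sub"))    # prints 1/3
```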
Chaitin, G. A Theory of Program Size Formally Identical to Information Theory 1975 Journal of the ACM (JACM)
Vol. 22(3), pp. 329-340 
article  
BibTeX:
@article{chaitin_theory_1975,
  author = {Chaitin, Gregory},
  title = {A Theory of Program Size Formally Identical to Information Theory},
  journal = {Journal of the ACM (JACM)},
  year = {1975},
  volume = {22},
  number = {3},
  pages = {329--340}
}
Blass, A. and Gurevich, Y. Abstract Hilbertian deductive systems, infon logic, and Datalog 2013 INFORMATION AND COMPUTATION
Vol. 231, pp. 21-37 
article URL 
Abstract: In the first part of the paper, we discuss abstract Hilbertian deductive systems; these are systems defined by abstract notions of formula, axiom, and inference rule. We use these systems to develop a general method for converting derivability problems, from a broad range of deductive systems, into the derivability problem in a quite specific system, namely the Datalog fragment of universal Horn logic. In this generality, the derivability problems may not be recursively solvable, let alone feasible; in particular, we may get Datalog "programs" with infinitely many rules. We then discuss what would be needed to obtain computationally useful results from this method. In the second part of the paper, we analyze a particular deductive system, primal infon logic with variables, which arose in the development of the authorization language DKAL. A consequence of our analysis of primal infon logic with variables is that its derivability problems can be translated into Datalog with only a quadratic increase of size. © 2013 Elsevier Inc. All rights reserved.
BibTeX:
@article{blass_abstract_2013,
  author = {Blass, A. and Gurevich, Y.},
  title = {Abstract Hilbertian deductive systems, infon logic, and Datalog},
  journal = {INFORMATION AND COMPUTATION},
  year = {2013},
  volume = {231},
  pages = {21--37},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V3fi9QwEA6iIh6i3ql1_QF5EEHuWpqm27QPIueeuqJv3oE-lSRNoIcu5133_3cmP7rdVcFnH9su7TYznXyZfPMNIbzI8nQnJqjOIlSA9ZCpwW3EXAoL4Fxxo-YAWZqdzHYkWW7O_Q-GP1aYvdDD4bL_7ljTEvmuKOuKJCGv3OyMh49YHbrYFzmcJz6ZM0WsoV5piLRl3wZia__-LSLwkR0pR5T-AXWBe99r6tv6sp9mGHxpaAxBTY7UCW9940NkDueKKsjEhhhahFAeo-BkPvWaLr9Fap80OM961JFk3Omo5nwzK8Wd-J3JaqQQonAcLgdfokT6j67Xw2uzSs–oJpAyZDcV33Mxv0kFkq24vuEDWvP9Nv-C1sAJUzTN7FEaH31R3DigMjpfXI3rCDosbf8PrlmVgfkXuzOQUOwPiB7E6nJB-RNdAu6cQs6ugUNbnFEnVNQ5xRHFOxNg0s8JGfv350ulmlonpFqACM8NQj2YDFppJw3XMG6WunC1iWTeWFsVQptdae4qCtmjWacdUYaA9Ed-9iKRvBH5I7EIovV4Ioxu4TcsPBhmARRSgKvnZBbX5vPJ_Xy08If7sfD7MpVDGY_hwQM4L6rtMrEY0IBnFujRK5srcqKM6VlWcEMqxvWdbqzM_IqDn574eVW2shDPG973aKhWmycmvMZSdA6LQ4MDmDLS4Gr-XkDV7zBxntEb5mRF1MLjtcxNcAZjEKNAA5uzf7lZ4sgo4_yEcOTvz70KblduCYqmLh7Rq4Pl2vz3Mt-_gIBJpXN}
}
Lunin, O. and Mathur, S.D. AdS/CFT duality and the black hole information paradox 2002 Nuclear Physics B
Vol. 623(1), pp. 342-394 
article  
BibTeX:
@article{lunin_ads/cft_2002,
  author = {Lunin, Oleg and Mathur, Samir D.},
  title = {AdS/CFT duality and the black hole information paradox},
  journal = {Nuclear Physics B},
  year = {2002},
  volume = {623},
  number = {1},
  pages = {342--394}
}
Grünwald, P.D., Myung, I.J. and Pitt, M.A. Advances in minimum description length: theory and applications 2005   book URL 
BibTeX:
@book{grunwald_advances_2005,
  author = {Grünwald, Peter D. and Myung, In J. and Pitt, Mark A.},
  title = {Advances in minimum description length: theory and applications},
  publisher = {MIT Press},
  year = {2005},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV07CwIxDA4-FkXwjU_o5KboXXuto4ji4uYud-0VHbxF_f8mvSoqOoZCQ0vTfEmarwBhMJtPv-4EpQQPrdDWGB1ZzRMexYgNrHT-2H5ltn_FjR8N6M8MBtUMQ44RexGDvDerzLHvUiG4oUiMSNdRU7D0jDtPWXyQ7zmPsm1AiboMmlBIsxbUPRZk3tKuLajuX3yqKFUIE-aUym2YrPK6_ZWdM0bUIJf7hZn0Zf-MPke5nTow3m4O692UdB99muaY-MWIoAu1mJ63ZzfXBmd6wBapllpKk4jAcm0QNSh05TKOjJKGC9GH7u_JBv8GhlBxLKQumzCCssXTno7z_XgAi5R6uA}
}
Floridi, L. Afterword LIS as applied philosophy of information: A reappraisal 2004 Library Trends
Vol. 52(3), pp. 658-667 
article URL 
Abstract: Floridi proposes that library information science should develop its foundation in terms of a philosophy of information (PI). He believes that PI should seek to explain a very wide range of phenomena and practices because its aim is foundationalist.
BibTeX:
@article{floridi_afterword_2004,
  author = {Floridi, Luciano},
  title = {Afterword LIS as applied philosophy of information: A reappraisal},
  journal = {Library Trends},
  year = {2004},
  volume = {52},
  number = {3},
  pages = {658--667},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwpV3di9QwEA_qgQiinnp19Q7ycPjWbpu2SSuIlL09bvF8UsG3kCbpcbjs1k0X9b93pl-73vlw4EshJP3KTGYmk5nfEBKzIPRvyIQwKZOMx1ZwAyqL5ymsMQ3aj8dcxCZWf3u2ydnoyVh9D1BS7lIcVF27qVnraYGmcYzwaNN5sfj0Ya-gxvY9JqvfJweg7RIE1F_Mi514ZkkPzZz4YPsnt2Uxoobqdb11_1RGreI5f0oG_9MQcDKeQu_y5G8HZP_XDz0jT3orlRYdWx2Se3b1nJz0OQ70Le2TmJCotJcOL0hRYL3xn7CbpZeLz1Q5qjoTl9ZDvYTfdF3R693N76iiYLXW9UZdO7V8Sb6ez7_MLvy-QoN_FXGQTkDPVFmjDBM6S5mGb8_yLFG4aYm4ZaFNw8imXKtECB5qU5kyyoRVaWmFSaP4iDxWGMm_atqMP-ORgwqWnfVQFXow1x55-C2_PMsuPs665uHQDFyblhb8aDwgbTtPPg_EK0KrTOVxoizuboHvcoUxtKKsKq1TXfFqQk6RBLIv8wkXh44Qd6W2zsmRChPitcNwVpqN0vs9xwOvSFWiS0o3TjIGzIapvBNyNHab5VLi2THiCkLHacdYsu6gRCSTjslQshwMpQxWA5PNrwZefGNYnIEhlwoBD9hnyLG_hTECwxSLFYCInZDoLsNmPQo8oh80r-82KW_Ioy52CZ1Qx-RBs9nakw7C8g_KSC74}
}
Floridi, L. Against Digital Ontology 2009 Synthese
Vol. 168(1), pp. 151-178 
article  
Abstract: The paper argues that digital ontology (the ultimate nature of reality is digital, and the universe is a computational system equivalent to a Turing Machine) should be carefully distinguished from informational ontology (the ultimate nature of reality is structural), in order to abandon the former and retain only the latter as a promising line of research. Digital vs. analogue is a Boolean dichotomy typical of our computational paradigm, but digital and analogue are only "modes of presentation" of Being (to paraphrase Kant), that is, ways in which reality is experienced or conceptualised by an epistemic agent at a given level of abstraction. A preferable alternative is provided by an informational approach to structural realism, according to which knowledge of the world is knowledge of its structures. The most reasonable ontological commitment turns out to be in favour of an interpretation of reality as the totality of structures dynamically interacting with each other. The paper is the first part (the pars destruens) of a two-part piece of research. The pars construens, entitled "A Defence of Informational Structural Realism", is developed in a separate article, also published in this journal.
BibTeX:
@article{floridi_against_2009,
  author = {Floridi, Luciano},
  title = {Against Digital Ontology},
  journal = {Synthese},
  year = {2009},
  volume = {168},
  number = {1},
  pages = {151--178}
}
Zurek, W.H. Algorithmic information content, Church-Turing thesis, physical entropy, and Maxwell's demon 1990   inproceedings  
Abstract: Measurements convert alternative possibilities of its potential outcomes into the definiteness of the "record" – data describing the actual outcome. The resulting decrease of statistical entropy has been, since the inception of Maxwell's demon, regarded as a threat to the second law of thermodynamics. For, when the statistical entropy is employed as the measure of the useful work which can be extracted from the system, its decrease by the information gathering actions of the observer would lead one to believe that, at least from the observer's viewpoint, the second law can be violated. I show that the decrease of ignorance does not necessarily lead to the lowering of disorder of the measured physical system. Measurements can only convert uncertainty (quantified by the statistical entropy) into randomness of the outcome (given by the algorithmic information content of the data). The ability to extract useful work is measured by physical entropy, which is equal to the sum of these two measures of disorder. So defined, physical entropy is, on the average, constant in the course of the measurements carried out by the observer on an equilibrium system. 27 refs., 6 figs.
BibTeX:
@inproceedings{zurek_algorithmic_1990,
  author = {Zurek, W. H.},
  title = {Algorithmic information content, Church-Turing thesis, physical entropy, and Maxwell's demon},
  publisher = {DOE/AD},
  year = {1990}
}
Chaitin, G.J. Algorithmic information theory 1987
Vol. 1 
book  
BibTeX:
@book{chaitin_algorithmic_1987,
  author = {Chaitin, Gregory J.},
  title = {Algorithmic information theory},
  publisher = {Cambridge University Press},
  year = {1987},
  volume = {1}
}
Grünwald, P.D. and Vitányi, P.M.B. Algorithmic Information Theory 2008, pp. 281-317  incollection  
BibTeX:
@incollection{grunwald_algorithmic_2008,
  author = {Grünwald, Peter D. and Vitányi, Paul M. B.},
  title = {Algorithmic Information Theory},
  booktitle = {Philosophy of Information},
  publisher = {Elsevier},
  year = {2008},
  pages = {281--317}
}
Chen, M. and Floridi, L. An analysis of information visualisation 2013 Synthese
Vol. 190(16), pp. 3421-3438 
article URL 
Abstract: Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of 'computer-aided seeing' information in data. Hence, information is the fundamental 'currency' exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualization pipeline. We do so from a quantitative and a qualitative perspective. The quantitative analysis is developed on the basis of Shannon's information theory. The qualitative analysis is developed on the basis of Floridi's taxonomy in the philosophy of information. We then discuss in detail how the condition of the 'data processing inequality' can be broken in a visualisation pipeline. This theoretic finding underlines the usefulness and importance of visualisation in dealing with the increasing problem of data deluge. We show that the subject of visualisation should be studied using both qualitative and quantitative approaches, preferably in an interdisciplinary synergy between information theory and the philosophy of information.
BibTeX:
@article{chen_analysis_2013,
  author = {Chen, Min and Floridi, Luciano},
  title = {An analysis of information visualisation},
  journal = {Synthese},
  year = {2013},
  volume = {190},
  number = {16},
  pages = {3421--3438},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3JTsMwEB2hnnoBWraySD6hIpQqXuIkxwpRcURsV8t1bISo0qotSP17xo3TCsIBrslkG8fz3ngWA3A2iKMfNiFHYm-yxGoqqGYuR1jQdCwsT7i0rKDfV7aBbVYyyvdBHaBc2-1t6RtlzGf6-MSCjEcrtMIIVT6p7-HxZRNHQGcgNGbMozRL0jqu-dsdmka5ER1dg85oD-pimDrZZBOB3tbIN5Ox__Ex-7Ab-CgZVj9QB3Zs2YX2fb3BwaoLnTD9F6QfelRfHUB_WBId-pmQqSOh_6ofZfL5tvCVmlWa0CE8j26fbu6isOlC9IqeIMdZ43LDEbWNzTmV0llkVAikwpeculQ6JozOkBaKxCCXknHqEPRNghhXaJdn_Aha5bS0J0AKYeJijIwzzpyguRhriYTBUod2xeCI9ODYK1z5V1zOtVHce2v4bIln6jFQxWSiOLKSLEGySHtwXatQzap-HGqtPYVMhqmgLYVaVLPC9eCyIY1yPFyCZEhR6YVP_yp4Bm1_vKpGPIfWcv5hL6oWjl85WdOw}
}
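The data processing inequality that the Chen and Floridi abstract above invokes states that for a Markov chain X → Y → Z, I(X;Z) ≤ I(X;Y): post-processing cannot create information about the source. A minimal numeric check (my own toy setup, not an example from the paper), using a uniform binary source passed through two binary symmetric channels:

```python
# Toy illustration (not from the paper) of the data-processing inequality
# I(X;Z) <= I(X;Y) for a Markov chain X -> Y -> Z of binary symmetric channels.

from itertools import product
from math import log2

def mutual_information(joint):
    """I(A;B) in bits, given a joint distribution {(a, b): probability}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

def chain_joints(eps1, eps2):
    """Joints P(X,Y) and P(X,Z) for X ~ Uniform{0,1} sent through two
    binary symmetric channels with flip probabilities eps1 and eps2."""
    def bsc(eps, x, y):
        return 1 - eps if x == y else eps
    pxy = {(x, y): 0.5 * bsc(eps1, x, y) for x, y in product((0, 1), repeat=2)}
    pxz = {(x, z): sum(0.5 * bsc(eps1, x, y) * bsc(eps2, y, z) for y in (0, 1))
           for x, z in product((0, 1), repeat=2)}
    return pxy, pxz

pxy, pxz = chain_joints(0.1, 0.2)
print(mutual_information(pxy) >= mutual_information(pxz))  # prints True
```

Here I(X;Y) = 1 − H(0.1) ≈ 0.531 bits, while the composed channel X → Z flips with probability 0.26, so I(X;Z) ≈ 0.173 bits, consistent with the inequality.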
Tononi, G. An information integration theory of consciousness 2004 BMC Neuroscience
Vol. 5(1), pp. 42 
article DOI URL 
Abstract: Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition?
BibTeX:
@article{tononi_information_2004,
  author = {Tononi, Giulio},
  title = {An information integration theory of consciousness},
  journal = {BMC Neuroscience},
  year = {2004},
  volume = {5},
  number = {1},
  pages = {42},
  url = {http://dx.doi.org/10.1186/1471-2202-5-42},
  doi = {10.1186/1471-2202-5-42}
}
Millikan, R. An Input Condition for Teleosemantics? Reply to Shea (and Godfrey-Smith) 2007 Philosophy and Phenomenological Research
Vol. 75, pp. 436-455 
article  
BibTeX:
@article{Millikan2007,
  author = {Millikan, R.},
  title = {An Input Condition for Teleosemantics? Reply to Shea (and Godfrey-Smith)},
  journal = {Philosophy and Phenomenological Research},
  year = {2007},
  volume = {75},
  pages = {436--455}
}
Borrill, P.L. An Insight into Information, Entanglement and Time 2015 It From Bit or Bit From It?, pp. 97-112  incollection  
BibTeX:
@incollection{borrill_insight_2015,
  author = {Borrill, Paul L},
  title = {An Insight into Information, Entanglement and Time},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {97--112}
}
Susskind, L. and Lindesay, J. An introduction to black holes, information and the string theory revolution: the holographic universe 2005   book  
BibTeX:
@book{susskind_introduction_2005,
  author = {Susskind, Leonard and Lindesay, James},
  title = {An introduction to black holes, information and the string theory revolution: the holographic universe},
  publisher = {World Scientific},
  year = {2005}
}
Fredkin, E. An Introduction to Digital Philosophy 2003 International Journal of Theoretical Physics
Vol. 42(2), pp. 189-247 
article  
Abstract: Digital Philosophy (DP) is a new way of thinking about how things work. This paper can be viewed as a continuation of the author's work of 1990[3]; it is based on the general concept of replacing normal mathematical models, such as partial differential equations, with Digital Mechanics (DM). DP is based on two concepts: bits, like the binary digits in a computer, correspond to the most microscopic representation of state information; and the temporal evolution of state is a digital informational process similar to what goes on in the circuitry of a computer processor. We are motivated in this endeavor by the remarkable clarification that DP seems able to provide with regard to many of the most fundamental questions about processes we observe in our world.
BibTeX:
@article{fredkin_introduction_2003,
  author = {Fredkin, Edward},
  title = {An Introduction to Digital Philosophy},
  journal = {International Journal of Theoretical Physics},
  year = {2003},
  volume = {42},
  number = {2},
  pages = {189--247}
}
Grazioso, F. An introduction to information theory and some of its applications: black hole information paradox and renormalization group information flow 2015 Canadian Journal of Physics
Vol. 93(9) 
article  
BibTeX:
@article{grazioso_introduction_2015,
  author = {Grazioso, Fabio},
  title = {An introduction to information theory and some of its applications: black hole information paradox and renormalization group information flow},
  journal = {Canadian Journal of Physics},
  year = {2015},
  volume = {93},
  number = {9}
}
Vitanyi, P.M.B. and Li, M. An Introduction to Kolmogorov Complexity and its Applications 2009   book URL 
BibTeX:
@book{vitanyi_introduction_2009,
  author = {Vitanyi, Paul M. B. and Li, Ming},
  title = {An Introduction to Kolmogorov Complexity and its Applications},
  publisher = {Springer Science & Business Media},
  year = {2009},
  url = {https://books.google.com.au/books?id=25fue3UYDN0C}
}
Carnap, R. and Bar-Hillel, Y. An Outline of a Theory of Semantic Information 1952 (247) School: Research Laboratory of Electronics, Massachusetts Institute of Technology  techreport  
BibTeX:
@techreport{carnap_outline_1952,
  author = {Carnap, Rudolf and Bar-Hillel, Yehoshua},
  title = {An Outline of a Theory of Semantic Information},
  school = {Research Laboratory of Electronics, Massachusetts Institute of Technology},
  year = {1952},
  number = {247}
}
Godfrey-Smith, P. Animal Communication Theory: Information and Influence 2012   inbook  
BibTeX:
@inbook{Godfrey-Smith2012,
  author = {Godfrey-Smith, P.},
  title = {Animal Communication Theory: Information and Influence},
  publisher = {Cambridge University Press},
  year = {2012}
}
Adriaans, P. and Vitanyi, P.M.B. Approximation of the Two-Part MDL Code 2009 IEEE Transactions on Information Theory
Vol. 55(1), pp. 444-457 
article URL 
Abstract: Approximation of the optimal two-part minimum description length (MDL) code for given data, through successive monotonically length-decreasing two-part MDL codes, has the following properties: (i) computation of each step may take arbitrarily long; (ii) we may not know when we reach the optimum, or whether we will reach the optimum at all; (iii) the sequence of models generated may not monotonically improve the goodness of fit; but (iv) the model associated with the optimum has (almost) the best goodness of fit. To express the practically interesting goodness of fit of individual models for individual data sets we have to rely on Kolmogorov complexity.
BibTeX:
@article{adriaans_approximation_2009,
  author = {Adriaans, Pieter and Vitanyi, Paul M. B.},
  title = {Approximation of the Two-Part MDL Code},
  journal = {IEEE Transactions on Information Theory},
  year = {2009},
  volume = {55},
  number = {1},
  pages = {444--457},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1Lb9QwELYQpyJEoTwaaKVIvMRhW7-SOIceqoUKJJA4LGcrfkSqCLvVJkv35zNjJ-nuAgL1kocysRN77Bl7Zr4hRPATOtmZEyzIgYwXvGalF8rlHBT1GmQ9SHcjREjmsLGzTYYsi5jvJLii-RO8DJZ9uG9PK9PqqmkCrGC17JNnSFAUiwIDzhkrWIzlGi0KMmMROZzBAIc1yWCypOXp7NMsOlXigWV8S0T1E3VIvRLgNjEP0rY6ivEjq_aPkitIqYt9MlisB–U0WR9E1T_u_f2rf7-IXnQq7LpeeS9R-SOnx-Q_SFNRNrPGgfk3gbm4WPy5hxRzNeXMWQyXdQpqKDp7Hox-QpcnH55_zmdLpx_Qr5dfJhNP076ZA0TK4QoJzXz3ChprMgMy6mRRgnjlOSVZblXXtjSgOTzzrpalNSComQqYZWoTY2mV_GU3K_QqX_eheA_d0hSdNmpbSadFFJyCpLTWSNL5b1SdalkQt4NHaWvIjiHDosaWmro1Jhis-_UhDzDntQ4brtlZTWotmEbDko5xuYdS6BBcc6lok4pqDchZ1vP_1aDDlTrH40O9mF9VX3XIPrL27wPnxXeV1zA972MXDeWYK8vNcKC43newOpNYS1vd6hWPysY0oEQLpEwRFHD_xwNzKuBk7AxulYjyJBQtKTQUONj10DpiGCYgdrHEvI6svpYBdct11RnChbxTIImzHW37hJyuEOHm4sFTABQ-KvNQXLT5rgXAMq9ykO0eELY_5BNexh7hG_onv-jG1-QvWgJxO2zI3K3W678cQTf_AWeeGar}
}
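The two-part MDL code in the Adriaans and Vitanyi abstract above scores a model M against data D by the total codelength L(M) + L(D|M) and picks the model minimizing that sum. A minimal sketch of the idea (my own toy model class of quantized Bernoulli parameters, not the paper's approximation algorithm):

```python
# Toy two-part MDL selection (illustrative; not the algorithm analysed in
# the paper): total codelength = L(model) + L(data | model), minimized
# over a finite grid of Bernoulli parameters.

from math import ceil, log2

def two_part_mdl(bits, grid=16):
    """Choose p from {1/grid, ..., (grid-1)/grid} minimizing
    L(model) + L(data | model), both measured in bits."""
    n, ones = len(bits), sum(bits)
    model_cost = ceil(log2(grid))          # bits to name a grid point
    best = None
    for k in range(1, grid):               # skip p in {0, 1}: infinite codelength
        p = k / grid
        data_cost = -(ones * log2(p) + (n - ones) * log2(1 - p))
        total = model_cost + data_cost
        if best is None or total < best[0]:
            best = (total, p)
    return best  # (total codelength in bits, selected parameter p)

total, p = two_part_mdl([1, 1, 1, 0, 1, 1, 1, 1])
print(p)  # the grid point at the empirical frequency 7/8, i.e. 0.875
```

Even this toy shows the tension the abstract describes: a richer model class (finer `grid`) can always lower L(D|M), but only at the price of a larger L(M).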
Hawkins, R.J. and Frieden, B.R. Asymmetric information and quantization in financial economics 2012 International Journal of Mathematics and Mathematical Sciences
Vol. 2012 
article URL 
BibTeX:
@article{hawkins_asymmetric_2012,
  author = {Hawkins, Raymond J. and Frieden, B. R.},
  title = {Asymmetric information and quantization in financial economics},
  journal = {International Journal of Mathematics and Mathematical Sciences},
  year = {2012},
  volume = {2012},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV07T8MwELYQEwghnmp5SFkYA37G9oJUEFUXFmBgixzbt7UU0krw7zk3JoJIrHGS4bN935199x0hgl_TcmATANDPptwBqGCZDRWVvmFB6yAMCN_8Pdnuc2oGF_rIfRisM34jNUWaQsOL7JR249Pstb86QMbr8hYrVjLNeC7KG3ybJfqTQKh_W67bX6QyPSD72RssJt30HZKtuDgiu4-9lGp7TG4n7dd8nvpe-SKrnCYsC7cIxfsaccmFlDhYwI98RhFzvXF7Qp6nDy_3szI3PSgDUnNpKTQsMudDbKwC3G4UpJGMOwrCRBMjWA1KgnJSeaV1BaKpMMpEx84LK07Jnkup8YvVpoQujEgBHiT1obIGogzSNFZGpXQTNXNMeTkmdwmKetnJW9RJcHrzAGGv8_qtpZDWBIynonNSBGODE2igaIwNOpCUj8lVB2T_G163vKa1wYiIM6RJQ-vV52pMRoP3RGr-IdDmnv0_dE520tx1Rx8XZHv1sY6XnXDiN5hrrg0}
}
Rybakov, V.V. Barwise's information frames and modal logics 2002 Algebra and Logic
Vol. 41(5), pp. 323-336 
article URL 
BibTeX:
@article{rybakov_barwises_2002,
  author = {Rybakov, V. V.},
  title = {Barwise's information frames and modal logics},
  journal = {Algebra and Logic},
  year = {2002},
  volume = {41},
  number = {5},
  pages = {323--336},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1LS8QwEA6iKIqIz64v6EU8SJc0SZP24MEngnpyBW-lTSci6q7a-th_bx7dbl30D3hMW0I7X_rNTJL5ghAlXRxMcAKNpMhFARgIlhLHOaVRTuKcFxmLsAx_zmw3eszja_8B-OPs7fPBHJco7FarUXHigTLbsJwi8_Og0MhY1ivb0enR071ZR7bPmEOYZbMcM8yzx8GH3RRbV6aMZgpIa6ZgzH468XTsBzXhCRoYDbA2I7KwhXzUojfqaoNrT0npLyLWE86l2fIXUp3ZRLHuwGiaPxcPsjqEfnB7o92njkoNOR1fXo31knmMR5mLeWWj5CoHL-9ly_v3ltFSHbb7R87cK2gK-qto9sxKfg9X0cJ1o35brqGghmC_9FsA-A4AXxvXtwD4DoB11Ds_651cBPWxFEHJKQ8yJoATYkJBRouEKh1BKqIDZ6VTW5ElMZA8YaAAVKFCqYDpjJNJmkSQc8Uk2UCLmale6Fe2yrHw0HT19g6e8f6e_jIPzd0lV6fxxeWJa66Mmt3SVuJ1XytP28mO14B3RQf5Cc3ySHEO3GjWYZkRQUDHakqGEQiON1HHmS99cSImaYPHJtqbuEXSkqQ4ZTq6S4xsLA_T6qva-ruLbTQ_Hm87aEbpHwh2nbDlN6QyO88}
}
Taylor, K.A. Belief, Information and Semantic Content: A Naturalist's Lament 1987 Synthese
Vol. 71(1), pp. 97-124 
article URL 
BibTeX:
@article{taylor_belief_1987,
  author = {Taylor, Kenneth A.},
  title = {Belief, Information and Semantic Content: A Naturalist's Lament},
  journal = {Synthese},
  year = {1987},
  volume = {71},
  number = {1},
  pages = {97--124},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlZ1LSwMxEMcH6akXtdVqfUDAgwqu7m6STeqltGLxICLY-5LNA4S6trYe_PZm9uX74H0DSyaZSWb–Q0Ajc_D4JtPcEiNCaWKhOWJzhhlWg1CKlU8iDLl3NfMdpPJQJFloRIsavr-uJTN7AWGLIyzw_kiwO5RWGWtWml4V4yIsrKVeVNM8DeCis44CITEQPQpBJUqxB-euAgvkw2o9Wy1rKSpNX-8hv9Fdv3P396E9er4SUbleunAms270L6v-xm8daFT7fYlOamQ1KdbMBxbf1Z1Z6R6vYTWJCo35ME-edM8alJQrvLVJRmRO1WwPPz6OV6SW4Xpx22YTq6nVzdB1Xoh0BFqDF3IMyW536JZIpWNaWYsY7HQjhqhjdV-LhMZIS_OX3kMt0oZ5PZwJZTjnPaglT_ndheIcfh4V_CEZZJFWijLhKbS2URrSxPdh6PaGOm8BGykNUp5PCmIgIyKPvSKCW0-qWezDzu14VIzm6URFix44n383l9D9qGNmZZSfnMArdXLqz0sKYzv7IXM-A}
}
Braunstein, S.L., Pirandola, S. and Zyczkowski, K. Better late than never: Information retrieval from black holes 2013 Physical Review Letters
Vol. 110(10), pp. 101301 
article URL 
Abstract: We show that, in order to preserve the equivalence principle until late times in unitarily evaporating black holes, the thermodynamic entropy of a black hole must be primarily entropy of entanglement across the event horizon. For such black holes, we show that the information entering a black hole becomes encoded in correlations within a tripartite quantum state, the quantum analogue of a one-time pad, and is only decoded into the outgoing radiation very late in the evaporation. This behavior generically describes the unitary evaporation of highly entangled black holes and requires no specially designed evolution. Our work suggests the existence of a matter-field sum rule for any fundamental theory. DOI: 10.1103/PhysRevLett.110.101301
BibTeX:
@article{braunstein_better_2013,
  author = {Braunstein, Samuel L. and Pirandola, Stefano and Zyczkowski, Karol},
  title = {Better late than never: Information retrieval from black holes},
  journal = {Physical Review Letters},
  year = {2013},
  volume = {110},
  number = {10},
  pages = {101301},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5EFATx_agPyEGP6-bVZOtBUFH2oB5EwVtpkva4iNsVf74zabeuL9BDL-lQmEw6j-T7JgBKnvDeF59gg7OVGQQuXFb5Al2kxqWi0ENn3kqvPu9sA__5QF9w1Sdg5H35SmQXGqDqU0X-lsZEnCgct0-dJxbSNJ5YEQyB25Yh_PtnPgWnLiItEEFkMv4xNMUwdL0KUyzyFH7SnUl_sOa_w7P_rN4arLT5KTtvFtQ6zJWjDViMOFE_3oSzi0j_YTeYojLadWd3Jf4Mp6ylNZGZ2X28pQuXMCPyCot7hGxInaO24PH66uFy2GuvYOgVwljT817aVDqLeVBlMq81PsGlIR0UwmOphDMsQsGrKitChtWN9ehAszIzpS64k4XahuWCoPqjOlL6wi4w73QqShdSrSptuXHGhYERVRVSXhqeJdCfGiF_blpu5LFU4SqfmR4ayJvpSWCnsVUnLxVxkrVN4Lgx3sebfCxzng-wUNMmNhzL67c6gd0vcspgZoYKqQSOZs3eCVA1KVKrUQWK-QmIv4hdtp3XqeNAvfdvTfdhScb7OAgEdwDz9cukPGw6SL4DGAwC9w}
}
Adriaans, P. Between order and chaos: The quest for meaningful information 2009 Theory of Computing Systems
Vol. 45(4), pp. 650-674 
article URL 
Abstract: The notion of meaningful information seems to be associated with the sweet spot between order and chaos. This form of meaningfulness of information, which is primarily what science is interested in, is not captured by either Shannon information or Kolmogorov complexity. In this paper I develop a theoretical framework that can be seen as a first approximation to a study of meaningful information. In this context I introduce the notion of facticity of a data set. I discuss the relation between thermodynamics and algorithmic complexity theory in the context of this problem. I prove that, under adequate measurement conditions, the free energy of a system in the world is associated with the randomness deficiency of a data set with observations about this system. These insights suggest an explanation of the efficiency of human intelligence in terms of helpful distributions. Finally I give a critical discussion of Schmidhuber's views, specifically his notion of low complexity art; I defend the view that artists optimize facticity instead.
BibTeX:
@article{adriaans_between_2009,
  author = {Adriaans, P.},
  title = {Between order and chaos: The quest for meaningful information},
  journal = {Theory of Computing Systems},
  year = {2009},
  volume = {45},
  number = {4},
  pages = {650--674},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1Li9RAEC58HRRxdX1FV-iDIgjR9CsdBRF3cPGyeFHw1nSnO3gYsusms-z-e6s6SeusCuItkJBMpipVX72-ApDiZVVesAkeo4aqIzI6RAQ61rWr6-ibQIRiwWi_ndmGhRVilvZiJJPlDkctJc1fCWnIt2n-7vh7SVukqNo6r9S4DFc5TaWifpuv-9kyyyaNpCBEoEkhnaucVSIVRWdWUqkAAxhZnm_5qdlaX-tpv8-wDUZpemQz_NFvJR91sANL09TSm5IL1j9H6n_v3f7fd78Nt2Ysy95PyncHLsV-F3aWPRFsNhu7cOMwc8MOd-Ht_tQaxj4R6ydzfWCrb-5oeMNQZVlKwDJE0uwwOsrZdJs1m2emSIfuwZeDD59XH8t5iUPZyoYCXf_aaNOhGcC_uVNROk6MNEZ52UTBnZFOuMo7bmKnlAmya41UogpKhaY1Ud6Hm46a_fsxDQWGh8BQcygwVR2CKRVN8NGL4JVTXJnaaVXAi0V29ngi7bCZnjkJ2qKgLQnanhfwgKRr6YMeT1xrBe3yIl6mAp5PAs832Zw6_EosUXLjoe3XFuFXw6sC9hY5WecpR9WOg81Swkfk02G9toiqEDzQrFoBzyb9yY8QdhC2soZ4gzQCKMXteDbiHS5cpyn4VY0u4OmvipfPk6tLuwUUQRJRAP-Xy1YzMTwRIoyP_vqzH8P1VFVLM5l7cGU82cQnE5HlD3SQMbc}
}
Gupta, A., Condit, C. and Qian, X. BioDB: An ontology-enhanced information system for heterogeneous biological information 2010 Data & Knowledge Engineering
Vol. 69(11), pp. 1084-1102 
article URL 
Abstract: This paper presents BIODB, an ontology-enhanced information system to manage heterogeneous data. An ontology-enhanced system is a system where ad hoc data is imported into the system by a user, annotated by the user to connect the data to an ontology or other data sources, and then all data connected through the ontology can be queried in a federated manner. The BIODB system enables multi-model data federation, i.e., it federates data that can be in different data models including relational, XML and RDF, sequence data and so on. It uses an ontologically enhanced system catalog, an ontological data index, an association index to facilitate cross-model data mapping, and a new algorithm for ontology-assisted keyword queries with ranking. The paper describes these components in detail, and presents an evaluation of the architecture in the context of an actual application. © 2010 Elsevier B.V. All rights reserved.
BibTeX:
@article{gupta_biodb:_2010,
  author = {Gupta, Amarnath and Condit, Christopher and Qian, Xufei},
  title = {BioDB: An ontology-enhanced information system for heterogeneous biological information},
  journal = {Data & Knowledge Engineering},
  year = {2010},
  volume = {69},
  number = {11},
  pages = {1084--1102},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1NT9wwELUqxKESAkopbEslH-itYRM7TuzeWCjqvVTtzYrjsdgi7a7YrMTP70zsLB8CVHG1J4niGc2Mx2-eGZPiJM8e-QQArduihLpwhXY1yDpg4o25qhfKFxIeVrZja0wCWaZIED1877vTyDit7XgxnY5_EqsIxp8_RcReEQm3whEifjyfrH0zxq8IaqxMRtIDD1GP-CJM5nVCexGxoXwuViXnvUmNI6vlkyGrD08XO2zARA2wlPVZ9V03_ROw7df_9i7bTpksP41y79gbmO2xneGWCJ6cxnv2ezKdn0–8dMZJ64EquJnMLvqkQc88baSdfBIKs1xgF8RSGeOtg3z1ZJHoiiypvvy–zXxffLsx9Zus4ha4lmLgP0yb4JXpSqQVOoQtkG74L2uVIiyOBUGbzMncuhDRW0jdFKaG3ABagaaOUHttUQ7H_W9e2B_pBx1UqlXSmaMg8lgDGicsaj_8Z9rXMmH7GvgwbtItJ32AHX9tf2CrekcJvTMbwcsWrQsn2gAIsB5uUHD8gmLK1Cd9O0VkhT6FpKnPkSzWT9fWGXwua2ro2qcbOICZTtbjt8wyM54jqkhHHEju_b13q-r4xiOqk06V6MWPE_YmeJ-p0oD7qPr_3fT-xthFFQLeqIbXQ3K_gcmSz_AWteNqc}
}
Artmann, S. Biological Information 2008 , pp. 22-39  incollection  
BibTeX:
@incollection{artmann_biological_2008,
  author = {Artmann, Stefan},
  title = {Biological Information},
  year = {2008},
  pages = {22--39}
}
Galas, D.J., Nykter, M., Carter, G.W., Price, N.D. and Shmulevich, I. Biological Information as Set Based Complexity 2010 IEEE Transactions on Information Theory
Vol. 56(2), pp. 667-677 
article URL 
Abstract: The significant and meaningful fraction of all the potential information residing in the molecules and structures of living systems is unknown. Sets of random molecular sequences or identically repeated sequences, for example, would be expected to contribute little or no useful information to a cell. This issue of quantitation of information is important since the ebb and flow of biologically significant information is essential to our quantitative understanding of biological function and evolution. Motivated specifically by these problems of biological information, a class of measures is proposed to quantify the contextual nature of the information in sets of objects, based on Kolmogorov's intrinsic complexity. Such measures discount both random and redundant information and are inherent in that they do not require a defined state space to quantify the information. The maximization of this new measure, which can be formulated in terms of the universal information distance, appears to have several useful and interesting properties, some of which we illustrate with examples.
BibTeX:
@article{Galas2010,
  author = {Galas, D. J. and Nykter, M. and Carter, G. W. and Price, N. D. and Shmulevich, I.},
  title = {Biological Information as Set Based Complexity},
  journal = {IEEE Transactions on Information Theory},
  year = {2010},
  volume = {56},
  number = {2},
  pages = {667--677},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1Lb9QwEB4hDgiEKC2QBlopBzhwSOpX1vERFqoiuLFI3CzHdk6rim5SCf49M3GSLgUJxC1WEufhsecbz8w3AFJUrLy1JpiuFSq0rUM87lqpUU37RnucXExqN-5k7O1sw1xlkeqdjKFosaLD0bOP7f7Mtb112-1IK-h2U_GMWqElb8h-51zzlMu1eBRUzRNzOMcJjjbJ7LJk5mzzYZOIKwW-DCMovKeipoV6LL3yKwqltJHr_o8Ka1RO5wcwO6rnoJTFU32TS_970PZ_ffRjeDQh2OJNErlDuBMvj-Bgrg5RTIvFETzYozp8AlUqekkiUUwpUCQSheuLz3Eo36IyDQV1QgSdw4-n8OX8_WZ9UU61Gkqv0IQpZSs77jpc-bmXwQlBpDLMeRGCkFE0HVMy-M4Z3fFoePBy1arQNa1Gg8rVWj6Dh45i-i-HMfcvZHB32F3HjHRihj80g3tfzad3zcXHdWoezs2qH_PTqqshw8Eep2-5qvQxFIGjgHpfy66RKnqBDzMympY1IaIZVefweh5o-y2Re9jRKGLGolBQiU5jJ6HIISNJsDTvh53zViBYQuzIRA6nNE5LD2wE3ivVcB6iiTGHk1lqLA4h3Tz0lkh9EAsqgx0vp8N2iycaskm15jm8SjK2dC1sLyyzWq8U4m78dcoO34ccjm9dJ2vEvabhKoeX-9J58460HVEzKahmLJM58H-5bD3RxhNdwvD8L5_9Au6nSAvarjpJo3mayC5_AthzNuc}
}
Sarkar, S. Biological Information: A Skeptical Look at Some Central Dogmas of Molecular Biology 1996 Molecular Models of Life: Philosophical Papers on Molecular Biology, pp. 205-260  incollection  
BibTeX:
@incollection{sarkar_biological_1996,
  author = {Sarkar, Sahotra},
  title = {Biological Information: A Skeptical Look at Some Central Dogmas of Molecular Biology},
  booktitle = {Molecular Models of Life: Philosophical Papers on Molecular Biology},
  publisher = {MIT Press},
  year = {1996},
  pages = {205--260}
}
Smolin, L. Black hole information paradox and relative locality 2014 Physical Review D
Vol. 90(2) 
article  
BibTeX:
@article{smolin_black_2014,
  author = {Smolin, Lee},
  title = {Black hole information paradox and relative locality},
  journal = {Physical Review D},
  year = {2014},
  volume = {90},
  number = {2}
}
Bekenstein, J.D. Black holes and information theory 2004 Contemporary Physics
Vol. 45(1), pp. 31-43 
article  
Abstract: During the past three decades investigators have unveiled a number of deep connections between physical information and black holes whose consequences for ordinary systems go beyond what has been deduced purely from the axioms of information theory. After a self-contained introduction to black hole thermodynamics, we review from its vantage point topics such as the information conundrum that emerges from the ability of incipient black holes to radiate, the various entropy bounds for non-black hole systems (holographic bound, universal entropy bound, etc.) which are most easily derived from black hole thermodynamics, Bousso's covariant entropy bound, the holographic principle of particle physics, and the subject of channel capacity of quantum communication channels.
BibTeX:
@article{bekenstein_black_2004,
  author = {Bekenstein, Jacob D.},
  title = {Black holes and information theory},
  journal = {Contemporary Physics},
  year = {2004},
  volume = {45},
  number = {1},
  pages = {31--43}
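The black-hole thermodynamics Bekenstein reviews rests on the Bekenstein-Hawking entropy S = k_B A c^3 / (4 G hbar), proportional to horizon area. A quick back-of-envelope script (rounded SI constants) makes the scale of the information content concrete for a solar-mass Schwarzschild hole:

```python
import math

# Rounded SI values of the physical constants
G     = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c     = 2.998e8      # speed of light, m/s
hbar  = 1.055e-34    # reduced Planck constant, J s
k_B   = 1.381e-23    # Boltzmann constant, J/K
M_sun = 1.989e30     # solar mass, kg

def bh_entropy_bits(M: float) -> float:
    """Bekenstein-Hawking entropy of a Schwarzschild black hole, in bits."""
    r_s = 2 * G * M / c**2                    # Schwarzschild radius
    area = 4 * math.pi * r_s**2               # horizon area
    S = k_B * area * c**3 / (4 * G * hbar)    # entropy, J/K
    return S / (k_B * math.log(2))            # dimensionless, in bits

print(f"{bh_entropy_bits(M_sun):.2e} bits")   # on the order of 10**77
```

The result, roughly 10^77 bits for one solar mass, is what makes the entropy bounds discussed in the abstract so striking for ordinary systems.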
}
Barbón, J.L.F. Black holes, information and holography 2009 Journal of Physics: Conference Series
Vol. 171, pp. 012009 
article URL 
BibTeX:
@article{barbon_black_2009,
  author = {Barbón, J. L. F.},
  title = {Black holes, information and holography},
  journal = {Journal of Physics: Conference Series},
  year = {2009},
  volume = {171},
  pages = {012009},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5EEATxLVsf0IsKYnc7faTZo4iLJ097D2mS4l66xXbBn–kaeu67kF7SmHa5tXMfJn5JgBxNA6DjTUhTKQ0iuuUS5siLie9ibnBSNts4NjGZq7tbEN_JuJiWXWKYExF59gnLBewdMqohBOcWPKnI_DRbGz5W2_9ShzTlTlCpHukZwgT6Nv-GlIw9B2bNVQtq1W9pmlmR9CHG_cRJoPb-ZsY_zsC-88tOIbDzgT1n9ycOYEdU57CXhsKquozuG_39Hx7bm796HeJVe3w-bLU_vuQ5Poc5rOX-fNr0B2nECwYI4xoCL0gakYWRcpUqJApAj-EaLBgOUeW5kpPs0iZgkwMQzCG59JkTBEoU0zG8QUcSBt1XzYtO0-PwOc5NSOUieYJ0s8f0Y3hhcwSQ2YSU6kHD9RqUbnEGaJ1eHMubA8I2wNUQoHC9YAHdz-EtwqJShce3LrxGWQjUUciFIzHyRQJZDEUzWfjwWhDjowrMrF4Gl7-p2JXsO98SHbv5Rp2m4-VuXFpG78A-WrLXg}
}
Lebowitz, J.L. Boltzmann Entropy and Time's Arrow 1993 Physics Today
Vol. 46, pp. 32-38 
article  
BibTeX:
@article{Lebowitz1993,
  author = {Lebowitz, J.L.},
  title = {Boltzmann Entropy and Time's Arrow},
  journal = {Physics Today},
  year = {1993},
  volume = {46},
  pages = {32--38}
}
Yin, J., Cao, Y. and Pan, J.-W. et al. Bounding the speed of `spooky action at a distance' 2013 School: University of Science and Technology of China, Shanghai 201315, China  techreport URL 
BibTeX:
@techreport{JuanYin2013,
  author = {Yin, Juan and Cao, Yuan and Pan, Jian-Wei and others},
  title = {Bounding the speed of `spooky action at a distance'},
  school = {University of Science and Technology of China, Shanghai 201315, China},
  year = {2013},
  url = {http://arxiv.org/pdf/1303.0614v1.pdf}
}
Pfaff, D.W. Brain Arousal and Information Theory: Neural and Genetic Mechanisms 2006   book  
BibTeX:
@book{pfaff_brain_2006,
  author = {Pfaff, Donald W.},
  title = {Brain Arousal and Information Theory: Neural and Genetic Mechanisms},
  publisher = {Harvard University Press},
  year = {2006}
}
Rolston, H. Care on Earth: generating informed concern 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 205-246  incollection  
BibTeX:
@incollection{rolston_care_2010,
  author = {Rolston, Holmes},
  title = {Care on Earth: generating informed concern},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {205--246},
  note = {DOI: 10.1017/CBO9780511778759.011}
}
Schiemer, G. Carnap's Early Semantics 2013 Erkenntnis: An International Journal of Analytic Philosophy
Vol. 78(3), pp. 487 
article URL 
Abstract: This paper concerns Carnap's early contributions to formal semantics in his work on general axiomatics between 1928 and 1936. Its main focus is on whether he held a variable domain conception of models. I argue that interpreting Carnap's account in terms of a fixed domain approach fails to describe his premodern understanding of formal models. By drawing attention to the second part of Carnap's unpublished manuscript Untersuchungen zur allgemeinen Axiomatik, an alternative interpretation of the notions 'model', 'model extension' and 'submodel' in his theory of axiomatics is presented. Specifically, it is shown that Carnap's early model theory is based on a convention to simulate domain variation that is not identical but logically comparable to the modern account.
BibTeX:
@article{schiemer_carnaps_2013,
  author = {Schiemer, Georg},
  title = {Carnap's Early Semantics},
  journal = {Erkenntnis: An International Journal of Analytic Philosophy},
  year = {2013},
  volume = {78},
  number = {3},
  pages = {487},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV07TwMxDLagLF14Q3lJtzEFLrm8bkJQUTEiARLbKXVyCKm0pS0D_x7nHkWAujBGyRI78dv-ADJxkbJfMmFoPGakrUWKXAdnFK2dQ1IuaSkchp-R7bqpP0YKam63QrKS3H6CMWh-yTOpM0PqXV1N31mEkYrp1gZTYx02eJzsTQ_cPN8sRXPjgcUWHhadoTbNWffS6YjBwgXLY-WXXSWjK8Uz2II2e9QWnCyz0N998n8Lsv99oW3YbCzU5Lp-UjuwFsa70L1vIQ8-96DXd3Rkej5PqvHIyUN4I_684nwfnga3j_071iAssJc4yI3Rb9U8YMkVIpkG0udka2ul1JDMKKHQEJG88W4YIp6RIkvcq-BRltajQovZAXTGk3HoQSJEKKV2Mhchly511qCxHCWWqEqr_REcRnIW8d8sZg6LjBSjNFZZ2mnpUfjRqBC5IidI8Iwfr9w5ga6ocSmIY6fQWcw-wlk9SfELP7y1Hw}
}
Graham, D.J. Chemical thermodynamics and information theory with applications 2011   book URL 
BibTeX:
@book{graham_chemical_2011,
  author = {Graham, Daniel J.},
  title = {Chemical thermodynamics and information theory with applications},
  publisher = {CRC Press},
  year = {2011},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3LCsIwEFx8IAoefOIT8gNKTeOmuQmi-AHexdrkph704t-7mzZgRaGXUkhoSLI7k50JQCyX0eJrT1A0kaxKjUWnEbXN1CpyzqwuhOUwclhmtn_hxpIAPTAYcaIJs8dVqBLI-1iVwdgbPVSnlMBQoEu0DB5P4b1kvucjyr4DNVYZdKFibz1obsO1az1o-JrMy6MPmyDmF5ykXe9Zfnn8QxD4F4XhKQ-r8GLEl2BKVXweSA9gvt8dt4cF934qiJpTWvxOLIfQPnOB–3phXDZCITElM_cdEY5k9IGjVNrqSzFFkzYtH0Mw9-NTf59mEIrJ0j5mUHd0Xy383xE3lIae7c}
}
Floridi, L. Children of the Fourth Revolution 2011 Philosophy & Technology
Vol. 24(3), pp. 227-232 
article URL 
Abstract: It is a well-known fact that artificial intelligence (AI) research seeks both to reproduce the outcome of our intelligent behaviour by non-biological means, and to produce the non-biological equivalent of our intelligence. The two souls of AI, the engineering and the cognitive one, have often engaged in fratricidal feuds for intellectual predominance, academic power, and financial resources. Perhaps because ACs are neither Asimov's robots nor HAL's children, the philosophical questions they posit are very concrete. When is an informational artefact a companion?
BibTeX:
@article{floridi_children_2011,
  author = {Floridi, Luciano},
  title = {Children of the Fourth Revolution},
  journal = {Philosophy & Technology},
  year = {2011},
  volume = {24},
  number = {3},
  pages = {227--232},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlR1di9QwcBAFORH1zrNXP6AivggtadLm41FWlwN9kPMD30KapBy47K23PdF_byZtdr1dH_SxdBqmM5PMZD4BGK1IuXMmsFYKR5mjVgrTE-5r4QxR3nRWhbtdd92zDXTjyVh-q1KAMp7b29I3xhpMmgyXYawwwXryoKpQxs8-ftl4WYLIcRbnttJwtynbhrEU2vzbIteU0-4Rjc1E7cXqar0XNI26aH4fUo1MykHZBKa3pfP7Odr_8Y8P4N5kphavR7k6hBt-eQR3_mheeAQHH9IUhF8P4flsKgovLvoi2JTFPKwwnBdn_sck28fwef720-y0nKYvlLZuBC2DJaJqRzlzjvSk7klvHXHCG0OIqYWVUtHaO9owz7hVnei44qrjTrak6xhlj-CuwSz95RCr-VwGt_qwpXyGai4LRMvg9lf1_o08fTcbHw_TY7WOJWfV9yELWjXuyJJX4gQKL5nljWKm4aTpcK47jvA1DcWx6aI1ObxKHNSrsV-H3nZmRnrqQE-N9NQihwx5rHEvD5fGamzzJHlLVXiT2K7dYqGpQM8rXn5zeDlKwWZ5qtdUEy1j6xwmgoGkh59DDic7cMFUaGNgPOCY-L3FMaKG1pieWKsDhivX5_BiDziCTV80mgXIx_8E9QQORpc4psg9hZvD5ZV_Nvaf_A0v2gl1}
}
Adami, C. and Ver Steeg, G. Classical information transmission capacity of quantum black holes 2014 Classical and Quantum Gravity
Vol. 31(7) 
article  
Abstract: The fate of classical information incident on a quantum black hole has been the subject of an ongoing controversy in theoretical physics, because a calculation within the framework of semi-classical curved-space quantum field theory appears to show that the incident information is irretrievably lost, in contradiction to time-honored principles such as time-reversibility and unitarity. Here, we show within this framework embedded in quantum communication theory that signaling from past to future null infinity in the presence of a Schwarzschild black hole can occur with arbitrary accuracy, and thus that classical information is not lost in black hole dynamics. The calculation relies on a treatment that is manifestly unitary from the outset, where probability conservation is guaranteed because black holes stimulate the emission of radiation in response to infalling matter. This stimulated radiation is non-thermal and contains all of the information about the infalling matter, while Hawking radiation contains none of it.
BibTeX:
@article{adami_classical_2014,
  author = {Adami, C. and Ver Steeg, G.},
  title = {Classical information transmission capacity of quantum black holes},
  journal = {Classical and Quantum Gravity},
  year = {2014},
  volume = {31},
  number = {7}
}
Shannon, C.E., Sloane, N.J.A., Wyner, A.D. and IEEE Information Theory Society Claude Elwood Shannon: collected papers 1993   book URL 
BibTeX:
@book{shannon_claude_1993,
  author = {Shannon, Claude E. and Sloane, N. J. A. and Wyner, A. D. and {IEEE Information Theory Society}},
  title = {Claude Elwood Shannon: collected papers},
  publisher = {IEEE Press},
  year = {1993},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV09C4MwED1qu7R06IfSL8E_YLGaxGQWpXu7S0IiHcSp_f9NooIWO4bAkcDlLu_l7gUgia9R-BMTKo6xSlQksca4LBLmd1mc8phWDCsk0ZjZnsKNowb0nsHQ-Ul7JHHA0SBvcCptGEZWusVCdWoeABPEOsWdfoxH4ns2oxQbmJsugy3MVLOD1UAVcA9uVvOPVEFem3qY4PHijcboLvhF_szuoTFVdqxLKbq1sdiDNTfV6s3bdrXJAwSqIhKJmIgbF4hQKvRFB-lkgrAQlLD0CN60sdO_iTMs21I8Qw5cYFFp51V-u70v5PBo0Q}
}
Boucheron, S., Garivier, A. and Gassiat, E. Coding on Countably Infinite Alphabets 2009 IEEE Transactions on Information Theory
Vol. 55(1), pp. 358-373 
article  
Abstract: This paper describes universal lossless coding strategies for compressing sources on countably infinite alphabets. Classes of memoryless sources defined by an envelope condition on the marginal distribution provide benchmarks for coding techniques originating from the theory of universal coding over finite alphabets. We prove general upper bounds on minimax regret and lower bounds on minimax redundancy for such source classes. The general upper bounds emphasize the role of the normalized maximum likelihood (NML) codes with respect to minimax regret in the infinite alphabet context. Lower bounds are derived by tailoring sharp bounds on the redundancy of Krichevsky-Trofimov coders for sources over finite alphabets. Up to logarithmic (resp., constant) factors the bounds are matching for source classes defined by algebraically declining (resp., exponentially vanishing) envelopes. Effective and (almost) adaptive coding techniques are described for the collection of source classes defined by algebraically vanishing envelopes. Those results extend our knowledge concerning universal coding to contexts where the key tools from parametric inference are known to fail.
BibTeX:
@article{boucheron_coding_2009,
  author = {Boucheron, S. and Garivier, A. and Gassiat, E.},
  title = {Coding on Countably Infinite Alphabets},
  journal = {IEEE Transactions on Information Theory},
  year = {2009},
  volume = {55},
  number = {1},
  pages = {358--373}
}
Aydede, M. and Güzeldere, G. Cognitive Architecture, Concepts, and Introspection: An Information‐Theoretic Solution to the Problem of Phenomenal Consciousness 2005 Noûs
Vol. 39(2), pp. 197-255 
article  
BibTeX:
@article{aydede_cognitive_2005,
  author = {Aydede, Murat and Güzeldere, Güven},
  title = {Cognitive Architecture, Concepts, and Introspection: An Information‐Theoretic Solution to the Problem of Phenomenal Consciousness},
  journal = {Noûs},
  year = {2005},
  volume = {39},
  number = {2},
  pages = {197--255}
}
Maruyama, K., Nori, F. and Vedral, V. Colloquium: the physics of Maxwell's demon and information 2009 Reviews of Modern Physics
Vol. 81(1), pp. 1 
article URL 
Abstract: The history of Maxwell's demon and a number of interesting consequences of the second law of thermodynamics in quantum mechanics and the theory of gravity are reviewed. Maxwell introduced his demon in 1871 as a thought experiment to demonstrate that the second law is a statistical principle that almost always holds, rather than an absolute law. The demon is used as the starting point of a discussion of the erasure principle, before the implications of the second law for the wave nature of light, Gibbs' paradox and the quantum superposition principle, quantum state discrimination, linearity in quantum dynamics, general relativity, and the Einstein equation from thermodynamics are discussed. The erasure of classical information encoded in quantum states, the thermodynamic derivation of the Holevo bound, entanglement detection using the demon, and physical implementations of the demon are then explored.
BibTeX:
@article{maruyama_colloquium:_2009,
  author = {Maruyama, Koji and Nori, Franco and Vedral, Vlatko},
  title = {Colloquium: the physics of Maxwell's demon and information},
  journal = {Reviews of Modern Physics},
  year = {2009},
  volume = {81},
  number = {1},
  pages = {1},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1bS8MwFA46EHwR73fIkz6MjjVN22Tgg4qyx-EuryXNRaZrC10H-u9Nm7TdBvsDvqakpPlOzy0n3wHAQ72-s6UT_L7LmaDa947D2BXaLknki7CUCKxEHGxltuvmm-3YfwC-TAVobT9fJXXNhsle2GqXH1MGHS67Qia2FNmypzYY1YTdhqm0mmdapo3Mm9o0dr76ZYm5VZZ9zZvMcmbur1ddO7J6dCZFXrUX6M4WrPjONhIOdCvh0JwkjWpBsvWl62rWw05ADMt6T7aaFfuGCKBWvaZby4aIGT3qtuapPpJ_GQ9t9LvJj11mb7zA99FDSZaeiDkvnmTqTMf7OhInYXWl6KMxzlq9GRpVu8JdlrhyLybH4MjGBfDZ4HkC9mR6Cg7sfp-BQYvqAGpMocUUZgpaTB-XsEIUakThGqLnYPr-NnkdOrbthfNZxueOtjiEc71fAaIsDqWrmPQYYsgLfRRTSQOkOOFYEUV8LLQPwrEMA4mZFIwp4l2ATpql8gpAhpHwaYCJq2jJrBd7SgfMShtbqv9cl16Dy_Lro3JRRc541GymflJvSCQWi0h7mTr4DShxb3bOuQWHrbzcgU6Rr-S9obf8Az-UR3c}
}
Rooij, R.V. Comparing questions and answers: A bit of logic, a bit of language, and some bits of information 2009
Vol. 5363, pp. 161-192 
inproceedings URL 
BibTeX:
@inproceedings{rooij_comparing_2009,
  author = {Rooij, Robert V.},
  title = {Comparing questions and answers: A bit of logic, a bit of language, and some bits of information},
  year = {2009},
  volume = {5363},
  pages = {161--192},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV3PT9swFLYQk9DQpA0GAcaEL-NSUtE4TZpJHLoCqjRASGOIW5SkiVRpJFCnEn8-7_nZTpSOO5dKjlPFcT6_X37vM2PC65-6HZnghZ4f4NEWaeCH6SgrhqCXBWjPtIiiQh9E10S2DdtAc-2df_iORfxf5XOlNwVuqlrlWvXMuQ12GYNNOS-zf0tVziJBcCDXseyt_HG8UHlExMbRIu7EUPvKzb_mleZgrVXqUdTG4oQOO4THKS2kcu-IJFaqM9rEeNxL5yonQYlilVJqr-iIqskzldVjjn1Ellva6stm66iiLSlKGu_da_yb0EbUCm0YD1eAf6TMpKAlGAVIcfCDSDDmJLgDpGMURH9qJPtQaOFJwnlAtO9azw_oDL4VFdLOGoGHu_j0yBVx2GhMkyXQUaQ2vVGA042U7OIY-dsfZ_OsPstL9-8fMBWQ7gg5T8-nNhQIcjUMRWNAIKcjbX7Ra2JJkpkGj0ijmmmxTFpEltwZMxLgZtXTUraMprsvbKcpJ-W3FqtbbC0vt9lng0uucbnNNq8th7D8ymKLGm5RwwEAXKPm55gDCnhVcIWZE57YtkbMibod8YI9ErtaeNlhR5cXd5OpS0OPn4h3JbbTKnbZpwQLQMpaFYrOHPahgEWbO2hBOfCaDtt4iK7OR9PfE2pumWZfqmrG_nPtwHyrNe8G_XCP8Zk_S7xhNgADOgUhlSRROPCzFAudwK3I0n32ozMgL5ZejKmR4Ob4Q_yN65d6n-29NfCDt7u-sY_NEjhk6_VimX8nbtBX_sSiYg}
}
Zurek, W.H. Complexity, entropy, and the physics of information: the proceedings of the 1988 Workshop on Complexity, Entropy, and the Physics of Information held May-June, 1989, in Santa Fe, New Mexico 1990
Vol. 8. 
inproceedings  
BibTeX:
@inproceedings{zurek_complexity_1990,
  author = {Zurek, Wojciech H.},
  title = {Complexity, entropy, and the physics of information: the proceedings of the 1988 Workshop on Complexity, Entropy, and the Physics of Information held May-June, 1989, in Santa Fe, New Mexico},
  publisher = {Addison-Wesley Pub. Co},
  year = {1990},
  volume = {8}
}
Chaitin, G. Computation, Physics and Beyond, Lecture Notes in Computer Science 2012 (2007), pp. 247-251  inbook  
BibTeX:
@inbook{Chaitin2012,
  author = {Chaitin, G.},
  title = {Computation, Physics and Beyond, Lecture Notes in Computer Science},
  publisher = {Springer},
  year = {2012 (2007)},
  pages = {247--251}
}
Dodig Crnkovic, G. and Giovagnoli, R. Computing nature: Turing centenary perspective 2013
Vol. 7 
book URL 
BibTeX:
@book{dodig_crnkovic_computing_2013,
  author = {Dodig Crnkovic, Gordana and Giovagnoli, Raffaela},
  title = {Computing nature: Turing centenary perspective},
  publisher = {Springer},
  year = {2013},
  volume = {7},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwY2AwNtIz0EUrExINzI0SUyyTUwzNEw3NDFJM0oCRn5KcYp5ikpRiaIE2sg1ffAFbzAGf4UXsQcc8NTE0ONIFdWk1rmNubQ2NgNWYmSkzA7MFSm4GF98W5hZmZgagfSBmoDUjwDrOBIljagQ5tgcuaY7KB98rgDjRD1xNuQkysKaC9i4IMTCl5gkzCMBubFCAZmARmBCwulKAnOgpyiDr5hri7KELMiseOpYTn2RiAT7Dz9JYjIElLz8vVYJBwdgyJQ3Yb0lLMjW2NLE0SUxMNk1OTLJIMkpJTEszNzCUZBDDboYULglpBi4j8K0PoJEGGQaWkqLSVFmIrwBbk4C3}
}
Christensen, J. Conceptual frameworks of accounting from an information perspective 2010 Accounting and Business Research
Vol. 40(3), pp. 287-299 
article  
Abstract: This paper analyses the benefits of accounting regulation and a conceptual framework using an information economics approach that allows consideration of uncertainty, multiple agents, demand for information, and multiple information sources. It also allows private information to enter the analysis. The analysis leads to a set of fundamental properties of accounting information. It is argued that the set of qualitative characteristics typically contained in conceptual frameworks does not adequately aggregate the information demands of users of accounting information. For example, the IASB's conceptual framework contains no guidelines for the trade-off between relevance and reliability. Furthermore, neutrality might not be part of an optimal regulation. The statistical bias introduced by the stewardship use of accounting information is not necessarily undesirable and will always remain; stewardship is the characteristic of accounting information that provides incentives for management to act in the desired way. Accounting information is inherently late compared to other information sources but influences and constrains the content of more timely sources. The accounting system does not exist in a vacuum. Other information sources are present and the purpose of the accounting system cannot be analysed without considering the existence of other information sources. Finally, financial statements are audited by an independent auditor. This implies that accounting data are hard to manipulate.
BibTeX:
@article{christensen_conceptual_2010,
  author = {Christensen, John},
  title = {Conceptual frameworks of accounting from an information perspective},
  journal = {Accounting and Business Research},
  year = {2010},
  volume = {40},
  number = {3},
  pages = {287--299}
}
Shea, N. Consumers need information: supplementing teleosemantics with an input condition 2007 Philosophy and Phenomenological Research
Vol. 75(2), pp. 404–435 
article URL 
Abstract: The success of a piece of behaviour is often explained by its being caused by a true representation (similarly, failure by falsity). In some simple organisms, success is just survival and reproduction. Scientists explain why a piece of behaviour helped the organism to survive and reproduce by adverting to the behaviour's having been caused by a true representation. That usage should, if possible, be vindicated by an adequate naturalistic theory of content. Teleosemantics cannot do so, when it is applied to simple representing systems (Godfrey-Smith 1996). Here it is argued that the teleosemantic approach to content should therefore be modified, not abandoned, at least for simple representing systems. The new 'infotel-semantics' adds an input condition to the output condition offered by teleosemantics, recognising that it is constitutive of content in a simple representing system that the tokening of a representation should correlate probabilistically with the obtaining of its specific evolutionary success condition.
BibTeX:
@article{Shea2007,
  author = {Shea, N.},
  title = {Consumers need information: supplementing teleosemantics with an input condition},
  journal = {Philosophy and Phenomenological Research},
  year = {2007},
  volume = {75},
  number = {2},
  pages = {404--435},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1La9tAEB5KAsUQ2qZNVPUBe-hV7j70WOUSgttgaCmmpNCbWO2uIJCkTqRA–87s3rUDgmkNwvb8lozu983s7PfACg558mdNQEtTdJQhtvUpbKR0nCnLS9rX0qdebud2Z4yGVRkGaoEw54-0qX6wn8kr8MZmh2vrxPqHkW7rEMrDVyKBTKSXid3WpALLgZ1RpEg4GVbEDQsxH0x4h302SasAXFOn8NY4jZWmkzbz_8OyN9Tif2f_-QFPBsYKTvpXWgfnvirlzBbjS0O_ryC1WI4p9mybwh2bDjCRCY9YqEtaMgxIgiyM0SxX62_RHud25ZRlpeZK_zG-rZjeBsXKsQO4Mfp57PFMhk6MSQWI0CMVptCKVU3RFAwBKqRJDiufZo3RuomS63mGfdlKb0WaVHkVjjts9zItFbOaq0OYc9QxT79OHJoF8Fug9PLRwR5ET7aCJ7-LL9-0ssvi_5yf7yct-H42fy6i9Cg4eEk-bx4DQy9JzeFk6FZiBWpaZTzJOYjiMakJgYx2rla99od1UbMUypVIZmT1IkzbMJrWf2OISKHqGh6dzfGVqIQiO6iLGM4DAacbjVaL4YPm04zvc97fTgMSAMxx9E85mOLQZqdJAm6GPLggI8efrVarr7jqzcPDfYtzPrsNFXJvYOd7ubWv-8lKP8CBMYWLQ}
}
Kolmogorov, A.N. Contemporary Debates on the Nature of Mathematics 1929 Problemy Peredachi Informatsii (Problems of Information Transmission)
Vol. 42, pp. 129-141 
article  
BibTeX:
@article{Kolmogorov1929,
  author = {Kolmogorov, A. N.},
  title = {Contemporary Debates on the Nature of Mathematics},
  journal = {Problemy Peredachi Informatsii (Problems of Information Transmission)},
  year = {1929},
  volume = {42},
  pages = {129--141}
}
Sterelny, K. Content, Control and Display: The Natural Origins of Content 2015 Philosophia
Vol. 43(3), pp. 549-564 
article  
Abstract: Hutto and Satne identify three research traditions attempting to explain the place of intentional agency in a wholly natural world: naturalistic reduction; sophisticated behaviourism, and pragmatism, and suggest that insights from all three are necessary. While agreeing with that general approach, I develop a somewhat different package, offering an outline of a vindicating genealogy of our interpretative practices. I suggest that these practices had their original foundation in the elaboration of much more complex representation-guided control structures in our lineage and the support and amplification of those control structures through external resources. Cranes (as Dennett calls them) became increasingly important in the explanation of systematically successful action. These more complex representational engines coevolved with selection to detect and respond to the control structures of others. Since much of that selection was driven by the advantages of cooperation and coordination, in part these control structures were co-opted as external signals and guarantees, in cooperation and coordination. As the time depth of cooperation and co-ordination extended, these public signals of belief and intent acquired secondary functions, as mechanisms to stabilise and structure control systems, making humans not just more transparent to one another at a time, but more predictable over time. Mindshaping, not just mindreading, became increasingly important in our lineage.
BibTeX:
@article{sterelny_content_2015,
  author = {Sterelny, Kim},
  title = {Content, Control and Display: The Natural Origins of Content},
  journal = {Philosophia},
  year = {2015},
  volume = {43},
  number = {3},
  pages = {549--564}
}
Morsing, M. and Schultz, M. Corporate social responsibility communication: stakeholder information, response and involvement strategies 2006 Business Ethics: A European Review
Vol. 15(4), pp. 323 
article  
Abstract: While it is generally agreed that companies need to manage their relationships with their stakeholders, the way in which they choose to do so varies considerably. In this paper, it is argued that when companies want to communicate with stakeholders about their CSR initiatives, they need to involve those stakeholders in a two-way communication process, defined as an ongoing iterative sense-giving and sense-making process. The paper also argues that companies need to communicate through carefully crafted and increasingly sophisticated processes. Three CSR communication strategies are developed. Based on empirical illustrations and prior research, the authors argue that managers need to move from 'informing' and 'responding' to 'involving' stakeholders in CSR communication itself. They conclude that managers need to expand the role of stakeholders in corporate CSR communication processes if they want to improve their efforts to build legitimacy, a positive reputation and lasting stakeholder relationships. [PUBLICATION ABSTRACT]
BibTeX:
@article{morsing_corporate_2006,
  author = {Morsing, Mette and Schultz, Majken},
  title = {Corporate social responsibility communication: stakeholder information, response and involvement strategies},
  journal = {Business Ethics: A European Review},
  year = {2006},
  volume = {15},
  number = {4},
  pages = {323}
}
Seising, R. Cybernetics, system(s) theory, information theory and Fuzzy Sets and Systems in the 1950s and 1960s 2010 Information Sciences
Vol. 180(23), pp. 4459-4476 
article  
BibTeX:
@article{seising_cybernetics_2010,
  author = {Seising, Rudolf},
  title = {Cybernetics, system(s) theory, information theory and Fuzzy Sets and Systems in the 1950s and 1960s},
  journal = {Information Sciences},
  year = {2010},
  volume = {180},
  number = {23},
  pages = {4459--4476}
}
Wiener, N. Cybernetics: or Control and Communication in the Animal and the Machine 1962   book URL 
BibTeX:
@book{Wiener1962,
  author = {Wiener, N.},
  title = {Cybernetics: or Control and Communication in the Animal and the Machine},
  publisher = {Massachusetts Institute of Technology},
  year = {1962},
  edition = {2nd},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1Na8MwDBXrBmNj0HVbvY8W-gcSHDuJ63O2UtiOPfRWotg5Brb2sn8_2bGhKd1RCBxhpDxL-D0DSJHy5OSfYJcG80Y1KDjSkaKmc3ODspaFbAhClRpOts_1jQMCepxgZJwWy_UIRtTkHVUlJRPhJIEh952YKAkEnbBNMJxGu47yO0fOaHu_HijzebhZ3cOloyBM4MJ2DzCODy8sQh0-wm31i_anc_zD_RPMVx-bap24VXZhGrPDEHMupnBXu1vs3cGz3QyDq5ZSzjIHA4y-yOB6q7_el-vPqjcn0Uz3npKVfh8YoY7P2KRM1TMsjHHqNqXhaNq8QOpmijqTrUCRGaVt-QLT8-G8_ud4gxuq0qwfO8xCjPN-b_4Agft7_w}
}
Liu, M., Hua, Q.-x., Hu, S.-Q., Jia, W., Yang, Y., Saith, S.E., Whittaker, J., Arvan, P. and Weiss, M.A. Deciphering the hidden informational content of protein sequences: foldability of proinsulin hinges on a flexible arm that is dispensable in the mature hormone 2010 The Journal of biological chemistry
Vol. 285(40), pp. 30989-31001 
article URL 
Abstract: Protein sequences encode both structure and foldability. Whereas the interrelationship of sequence and structure has been extensively investigated, the origins of folding efficiency are enigmatic. We demonstrate that the folding of proinsulin requires a flexible N-terminal hydrophobic residue that is dispensable for the structure, activity, and stability of the mature hormone. This residue (Phe(B1) in placental mammals) is variably positioned within crystal structures and exhibits (1)H NMR motional narrowing in solution. Despite such flexibility, its deletion impaired insulin chain combination and led in cell culture to formation of non-native disulfide isomers with impaired secretion of the variant proinsulin. Cellular folding and secretion were maintained by hydrophobic substitutions at B1 but markedly perturbed by polar or charged side chains. We propose that, during folding, a hydrophobic side chain at B1 anchors transient long-range interactions by a flexible N-terminal arm (residues B1-B8) to mediate kinetic or thermodynamic partitioning among disulfide intermediates. Evidence for the overall contribution of the arm to folding was obtained by alanine scanning mutagenesis. Together, our findings demonstrate that efficient folding of proinsulin requires N-terminal sequences that are dispensable in the native state. Such arm-dependent folding can be abrogated by mutations associated with β-cell dysfunction and neonatal diabetes mellitus.
BibTeX:
@article{liu_deciphering_2010,
  author = {Liu, Ming and Hua, Qing-xin and Hu, Shi-Quan and Jia, Wenhua and Yang, Yanwu and Saith, Sunil E. and Whittaker, Jonathan and Arvan, Peter and Weiss, Michael A.},
  title = {Deciphering the hidden informational content of protein sequences: foldability of proinsulin hinges on a flexible arm that is dispensable in the mature hormone},
  journal = {The Journal of biological chemistry},
  year = {2010},
  volume = {285},
  number = {40},
  pages = {30989--31001},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV27TsMwFLVQF1gQtDzKQ_JUsSQkjvPw0KEqVF2QkIA5sh0biqhbVWGAr8fXTkrDyJrYinxi-R5fH5-LUELCKPizJuSc0TSx1FXJxHJynQkWZapgLkZol9_fyWyjVkUGIsuOVDE0izcnt2xw3bh6a6GGfSM8qsbEfihl0aherT7G66Xrp75HvqEzIuICkgiytus2HAu1O_x29aZNlT2QLpC0aO2Acnr7LmT4EPuaQRmFmjfghZ4UrmrLb1DbRrKuynInbM2O0GHDN_HED-QY7SnTR4OJsXvt5RceYacAdan1PtqfttXfBmhyp-Ri7S4HvmJLEvEcvEYMbq4v-Qwidt5WpsYrjR_B8GFh8FOrzT5BL7P75-k8aMotBNKOgwZcANkQEWeWFaW6YjRTgsWqIlonsC-pipzwjAi4y0pUXAkpZEZyxQiRsoiSU9QzK6POESaaUxUDe6GKCst5UlFx-58l5zFPpRqimxavcu1dNUp3Gp7T0qJcAsqlR3mIzjyM24Yt6EOUdwDeNgDH7O4bO2ucc3YzOS7-3fMSHTj5gFPzXaFevflU197B8QfyauN_}
}
Vedral, V. Decoding reality: the universe as quantum information 2010   book  
BibTeX:
@book{vedral_decoding_2010,
  author = {Vedral, Vlatko},
  title = {Decoding reality: the universe as quantum information},
  publisher = {Oxford University Press},
  year = {2010}
}
Clifford, J. and Adami, C. Discovery and information-theoretic characterization of transcription factor binding sites that act cooperatively 2015 PHYSICAL BIOLOGY
Vol. 12(5), pp. 056004 
article  
Abstract: Transcription factor binding to the surface of DNA regulatory regions is one of the primary causes of regulating gene expression levels. A probabilistic approach to model protein-DNA interactions at the sequence level is through Position Weight Matrices (PWMs) that estimate the joint probability of a DNA binding site sequence by assuming positional independence within the DNA sequence. Here we construct conditional PWMs that depend on the motif signatures in the flanking DNA sequence, by conditioning known binding site loci on the presence or absence of additional binding sites in the flanking sequence of each site's locus. Pooling known sites with similar flanking sequence patterns allows for the estimation of the conditional distribution function over the binding site sequences. We apply our model to the Dorsal transcription factor binding sites active in patterning the Dorsal-Ventral axis of Drosophila development. We find that those binding sites that cooperate with nearby Twist sites on average contain about 0.5 bits of information about the presence of Twist transcription factor binding sites in the flanking sequence. We also find that Dorsal binding site detectors conditioned on flanking sequence information make better predictions about what is a Dorsal site relative to background DNA than detection without information about flanking sequence features.
BibTeX:
@article{clifford_discovery_2015,
  author = {Clifford, J. and Adami, C.},
  title = {Discovery and information-theoretic characterization of transcription factor binding sites that act cooperatively},
  journal = {PHYSICAL BIOLOGY},
  year = {2015},
  volume = {12},
  number = {5},
  pages = {056004}
}
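The PWM construction described in the abstract can be illustrated with a minimal sketch. This is not the authors' code: the aligned site sequences are invented, and the information measure shown is the standard unconditional one (sum over positions of 2 bits minus the column entropy, against a uniform DNA background), not the conditional PWM variant the paper develops.

```python
import math

def pwm_from_sites(sites, alphabet="ACGT", pseudocount=0.5):
    """Estimate a position weight matrix (column-wise base frequencies)
    from a list of aligned, equal-length binding-site sequences."""
    length = len(sites[0])
    pwm = []
    for i in range(length):
        counts = {b: pseudocount for b in alphabet}
        for s in sites:
            counts[s[i]] += 1
        total = sum(counts.values())
        pwm.append({b: counts[b] / total for b in alphabet})
    return pwm

def information_content(pwm):
    """Total information in bits under positional independence:
    sum over columns of 2 - H(column), H in bits (uniform background)."""
    total = 0.0
    for col in pwm:
        h = -sum(p * math.log2(p) for p in col.values() if p > 0)
        total += 2.0 - h
    return total

# Invented aligned sites, purely for illustration.
sites = ["GGGAAAACCC", "GGGAAATCCC", "GGGTAAACCC", "GGGAAAACCC"]
print(information_content(pwm_from_sites(sites)))
```

The pseudocount keeps unobserved bases from forcing zero probabilities; fully conserved columns contribute close to 2 bits each, while uninformative columns contribute close to 0.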
Lyakhov, I.G., Krishnamachari, A. and Schneider, T.D. Discovery of novel tumor suppressor p53 response elements using information theory 2008 Nucleic acids research
Vol. 36(11), pp. 3828-3833 
article URL 
Abstract: An accurate method for locating genes under tumor suppressor p53 control that is based on a well-established mathematical theory and built using naturally occurring, experimentally proven p53 sites is essential in understanding the complete p53 network. We used a molecular information theory approach to create a flexible model for p53 binding. By searching around transcription start sites in human chromosomes 1 and 2, we predicted 16 novel p53 binding sites and experimentally demonstrated that 15 of the 16 (94%) sites were bound by p53. Some were also bound by the related proteins p63 and p73. Thirteen of the adjacent genes were controlled by at least one of the proteins. Eleven of the 16 sites (69%) had not been identified previously. This molecular information theory approach can be extended to any genetic system to predict new sites for DNA-binding proteins.
BibTeX:
@article{lyakhov_discovery_2008,
  author = {Lyakhov, Ilya G. and Krishnamachari, Annangarachari and Schneider, Thomas D.},
  title = {Discovery of novel tumor suppressor p53 response elements using information theory},
  journal = {Nucleic acids research},
  year = {2008},
  volume = {36},
  number = {11},
  pages = {3828--3833},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV3fa9swED7KGHQwStduq9euCLblLWli-Yf00IeStvR5bM9ClqWuLHGC40DTv353kp3VXRljb050Mkgn6z5Jn74D4PFoPHwyJ3BRYKRCeO-kT4gkcBpMeIFY1ZhY6ri_sw0di4xIlj2q4qi6–Hplm2_1j7f2sjRupH-Ks9jn1NrPGgWi9n5cu7r2YdBMPRCRLqgTQTT4LxNemndCr89ecCAFiSnvEJnIjpJU8nPKl2f3f6sJsJLjQpcWuRp0otnvTtyj6Dqn4zLl3TpZL16Ntz50Ha9D3stJmUXobFvYMdWB3B4UeF6fL5hA-ZZon77_QB2p12GuEP4enmH78dvYcMWjlX4NGPNer6o2Wq99ARbfFymnNWBiGuZDWT1FSO6_S1rVVtpbDB_o3LzFr5fX32b3gzbHA1Dk_JEDrW11jnEiSQK5IqJyxAAi7w0qcEx6lJe4ic-ziWiLAQb2Lkuk0XsrJRYnMucv4PXmrj8VePv_JVHwIw2AueaTCN0S-Ik0_iLCyOMwxdqm0bwqetytQyaHCqcpXOFPlLBRxGcojf-avC-c5QqZzOF0BjBKccGUIl31-_arbcjyHuO3BqQSne_BEeqV-tuB2QEX4LLt1VitYrVWCUZp8NX0l1TzX0TwdETO0qhOKFExRF8fjxYtgYU1VLEGDj10iFrBJN_MZu2GvCkfdB8-O92HcOrQKGhjakTeNHUa_sxaFr-Ak4rPqI}
}
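The site-prediction step in the abstract amounts to scoring candidate windows of genomic sequence against a binding-site model and reporting high-scoring positions. The sketch below is illustrative only, not Schneider's implementation: the two-position "AG" motif and its frequencies are invented, and the score is a simple log2-odds sum against a uniform background.

```python
import math

def log_odds_matrix(pwm, background=0.25):
    """Per-position log2-odds weights against a uniform DNA background,
    in the spirit of individual-information scoring."""
    return [{b: math.log2(p / background) for b, p in col.items()} for col in pwm]

def scan(sequence, weights, threshold=0.0):
    """Slide the weight matrix along the sequence; return (offset, score)
    for every window whose summed weight meets the threshold."""
    w = len(weights)
    hits = []
    for i in range(len(sequence) - w + 1):
        score = sum(weights[j][sequence[i + j]] for j in range(w))
        if score >= threshold:
            hits.append((i, round(score, 2)))
    return hits

# Toy two-base "AG" motif; a real model would be built from proven sites.
pwm = [{"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
       {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1}]
print(scan("TTAGTT", log_odds_matrix(pwm)))  # only the "AG" window scores above 0
```

Positive scores mark windows that look more like the site model than like background DNA; the threshold trades sensitivity against false positives.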
Goldman, A. Discrimination and Perceptual Knowledge 1976 The Journal of Philosophy
Vol. 73, pp. 771-791 
article  
BibTeX:
@article{Goldman1976,
  author = {Goldman, A.},
  title = {Discrimination and Perceptual Knowledge},
  journal = {The Journal of Philosophy},
  year = {1976},
  volume = {73},
  pages = {771--791}
}
Gray, R.M. Distortion and Approximation 2011 Entropy and Information Theory, pp. 117-146  incollection  
BibTeX:
@incollection{gray_distortion_2011,
  author = {Gray, Robert M.},
  title = {Distortion and Approximation},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {117--146}
}
Gray, R.M. Distortion and Entropy 2011 Entropy and Information Theory, pp. 147-171  incollection  
BibTeX:
@incollection{gray_distortion_2011-1,
  author = {Gray, Robert M.},
  title = {Distortion and Entropy},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {147--171}
}
Gray, R.M. Distortion and Information 2011 Entropy and Information Theory, pp. 237-263  incollection  
BibTeX:
@incollection{gray_distortion_2011-2,
  author = {Gray, Robert M.},
  title = {Distortion and Information},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {237--263}
}
Stegmann, U.E. DNA, Inference, and Information 2009 The British Journal for the Philosophy of Science
Vol. 60(1), pp. 1-17 
article URL 
Abstract: This paper assesses Sarkar's ([2003]) deflationary account of genetic information. On Sarkar's account, genes carry information about proteins because protein synthesis exemplifies what Sarkar calls a 'formal information system'. Furthermore, genes are informationally privileged over non-genetic factors of development because only genes enter into arbitrary relations to their products (in virtue of the alleged arbitrariness of the genetic code). I argue that the deflationary theory does not capture four essential features of the ordinary concept of genetic information: intentionality, exclusiveness, asymmetry, and causal relevance. It is therefore further removed from what is customarily meant by genetic information than Sarkar admits. Moreover, I argue that it is questionable whether the account succeeds in demonstrating that information is theoretically useful in molecular genetics.
BibTeX:
@article{stegmann_dna_2009,
  author = {Stegmann, Ulrich E.},
  title = {DNA, Inference, and Information},
  journal = {The British Journal for the Philosophy of Science},
  year = {2009},
  volume = {60},
  number = {1},
  pages = {1--17},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwhV1LT8MwDLbQTrsAG6-NofXAAaSNpU3bNMcJmOACCIHELUrc9ICmMm1Dgn-Psz7QhjTOaazKiR-x_dkAPLhiww2dkMZSC2FJKp3JzLKQ7LJgEWLsuo8Ysx7ZrmtqNhP6ko_M-2wx0l85W6HVyTi5sQXPj2-1-o1jWbRedjE48h9LUN7G3jUzVCrjNYRbpZJXdmayB9VA16q-pE46_8Li_9Zfb___fdgtvU5vXFyTFuzYvA3Np2qMwXcbWqWQL7yLshP15QH0bx7GA—wgQOPJ2nXglgcgd6CK-T25fru2E5UWGIq1iob0IdCW6FTjRHyaRJbBCgTqQbwKGRWcxMGFryKzASmcgkMmkz-pTcpjjl_Aga-UduT8DLEp4GEmmV3rWGh4Y8QcasLzHSqAPswHnFYDUrGmeoIuHNleOBKnjQgWPHfOXEaTnXqEgjBEQxjjrQp_PYvrdXnZXSxkWGcLlQDnNOLwYmiHS9nE6nyo_JWyX1Jfzuv6RPoVmkjVyxWQ8ay_mnPSs6Nf4AwjDRRg}
}
Sarkar, S. Does “Information” Provide a Compelling Framework for a Theory of Natural Selection? Grounds for Caution 2014 Philosophy of Science
Vol. 81(1), pp. 22-30 
article URL 
Abstract: Frank has recently argued for an information-theoretic interpretation of natural selection. This interpretation is based on the identification of a measure related to the Malthusian parameter (for population change) with the Jeffreys divergence between the present allelic distribution of the population and that distribution in the next generation. It is pointed out in this analysis that this identification only holds if the mean fitness of the population is a constant, that is, there is no selection. This problem is used to argue for the superiority of the standard dynamical interpretation of natural selection over its information-theoretic counterpart.
BibTeX:
@article{sarkar_does_2014,
  author = {Sarkar, Sahotra},
  title = {Does “Information” Provide a Compelling Framework for a Theory of Natural Selection? Grounds for Caution},
  journal = {Philosophy of Science},
  year = {2014},
  volume = {81},
  number = {1},
  pages = {22--30},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1Lb9QwEB4hkFAlBGwLaQqVDOIAhyxJJrGTU4VSVpVAaMVD4hZ5Y7uXarc06aG3_pD2z_WXMLaT0C4UODqx85zHN7bnGwBMp3G0ZhNMrgSWxqg0bsiHlOSVpCmMkbxQpRZrM9vAblnQL_hbTrGc4_9EgVYRP1fzcdUgKWJveTGJijQrrtUS8sNGu-v3Ht5ElTYN5LRdr0LyR4fknM_sEQxVuYZNJ-NK9K9c-d83Zd_6Uo_hYY9C2TsvNhO4o5ebsDEfyhqcbcKkV_qWve6Zqd9swWp_RQdeXp1f9FlM9q9S85LNfUIfk8waGO2Ivtls2PjFqDOd8jwAbGXYJ-noPtgXV4WHLrK3x-ws2FK1rm8lnT48gW-z91-rg6iv2BAdWiq3SOVx3JB9FU2iZaYNoimUoiCs4bnihMQomlHCyKQsNHLM6LUJoEhJqHUh8kWJT-GBtDv7l53LAFQB3DOkhjqwrjGg7x7A_e_lx_3i4EPlm5OhOW1dmtr0RxeQPDgtjvhUbAPTAnmjcaEkfee4FDKRBCObRHKK3lCUIQRWKmqr0t2JbGrkhJuRXEgIW05Q6mNP_VH7H0UDBrmp1dFRTUYRHdRNQtj2YjSOQJFSTItZHsLuDcGqe2PRjhd98dfz9bEyIby6LozjTWxki0lKwMPijzSE5H-6VT0LvGU_6Hb-8XjPYIOwYuZnn57D3e7kVO967sqfXSonug}
}
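The quantity at issue in Sarkar's critique, the Jeffreys divergence between successive allelic distributions, is just the symmetrized Kullback–Leibler divergence. A minimal illustration (the allele frequencies below are invented, and both distributions are assumed to have full support):

```python
import math

def kl_divergence(p, q):
    """Kullback–Leibler divergence D(p || q) in bits.
    Assumes q is nonzero wherever p is nonzero."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_divergence(p, q):
    """Jeffreys divergence: the symmetrized sum D(p || q) + D(q || p)."""
    return kl_divergence(p, q) + kl_divergence(q, p)

# Illustrative allele frequencies before and after one generation.
before = [0.5, 0.5]
after = [0.6, 0.4]
print(jeffreys_divergence(before, after))
```

The divergence is zero exactly when the two distributions coincide, i.e. when the allelic distribution does not change between generations, which is the degenerate no-selection case the paper presses on.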
Woschni, E.-G. Dynamics of measurement-relations to system and information theory 1977 Journal of Physics E: Scientific Instruments
Vol. 10(11), pp. 1081 
article URL 
Abstract: The article begins by pointing out the requirements placed on the dynamics of measurement due to advanced computing techniques and automation engineering. This is followed by a survey of basic terms such as root mean square deviation and other error definitions, their correlations and interpretations with the aid of modern geometry in Euclidean and non-Euclidean spaces. Relations between data storage and measuring errors are shown using the findings of information theory. Furthermore, methods for the analysis and synthesis, i.e. optimisation, of measuring systems are discussed.
BibTeX:
@article{woschni_dynamics_1977,
  author = {Woschni, E.-G.},
  title = {Dynamics of measurement-relations to system and information theory},
  journal = {Journal of Physics E: Scientific Instruments},
  year = {1977},
  volume = {10},
  number = {11},
  pages = {1081},
  url = {http://stacks.iop.org/0022-3735/10/i=11/a=001}
}
Giddings, S. and Shi, Y. Effective field theory models for nonviolent information transfer from black holes 2014 PHYSICAL REVIEW D
Vol. 89(12) 
article  
Abstract: Transfer of quantum information from the interior of a black hole to its atmosphere is described, in models based on effective field theory. This description illustrates that such transfer need not be violent to the semiclassical geometry or to infalling observers, and in particular can avoid producing a singular horizon or "firewall". One can specifically quantify the rate of information transfer and show that a rate necessary to unitarize black hole evaporation produces a relatively mild modification to the stress tensor near the horizon. In an exterior description of the transfer, the new interactions responsible for it are approximated by "effective sources" acting on fields in the black hole atmosphere. If the necessary interactions couple to general modes in the black hole atmosphere, one also finds a straightforward mechanism for information transfer rates to increase when a black hole is mined, avoiding paradoxical behavior. Correspondence limits are discussed, in the presence of such new interactions, for both small black holes and large ones; the near-horizon description of the latter is approximately that of Rindler space.
BibTeX:
@article{giddings_effective_2014,
  author = {Giddings, SB and Shi, YB},
  title = {Effective field theory models for nonviolent information transfer from black holes},
  journal = {PHYSICAL REVIEW D},
  year = {2014},
  volume = {89},
  number = {12}
}
Blazsó, T. Elementary information in physics 2011 Physics Essays
Vol. 25(1), pp. 83-90 
article URL 
Abstract: The paper is an attempt to discover the roots of information in physics. The usual, entropy-based information is a special case, which is closely related to the quantum physical bound state. On the other hand, free particles produce increasing information. A short review of the present theories on the relation of the physical quantities to information and possible future tasks are also included. The present paper will not investigate fully the relationship between physical and communication information but outlines some similarities and differences. [DOI: 10.4006/0836-1398-25.1.83]
BibTeX:
@article{blazso_elementary_2011,
  author = {Blazsó, Tibor},
  title = {Elementary information in physics},
  journal = {Physics Essays},
  year = {2011},
  volume = {25},
  number = {1},
  pages = {83--90},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwpV1bS8MwFD54RxDv1nmBCfrYLkvSLHkQGXNjor754Ftp09QHYc61A_33njRdvYPgW1NC6clJzvnONQCMBsT_IhPaLKPE6BTRvJKK8JTzkLCM06STaGniz55tuKw9GaPHwErK9xKHeDzOW-mTbnWpcnGDdqvfvbq9-HChxvTcFqvPwyLCF2FP61W_W1thqHFLx4tktgGfki7UiVtZtOp3Pg2DdmBbCVZyesnWiEzzH7VTqYkGGzBzSM0yUOqw9Hvh_PcM7X9RuAnrFWxtdt0-24I5M9qG5TJ9VOc7cNKvstAnr82qFatlOD43nesk34W7Qf-uN_Sryxf8B8mYlTuZQqYprgzXMuE8RbEZi5hLYQhnScZFakIbp-No1SBOC5kmIpYIGMMwTNgerMU2R39UlLV8qQeLGR4o41kl5-GiebByr24u5fC654Zbs2GQlwVnwXPhIdNKgn0RdPahmQhbCCsSaRD5aaJlxwiWSBQiKeloFTfAs2sZWUqLSayjegUbcOZYGI1dF4-IRjmNSCSRIBoyqiSJipeiAftf5jHEsIhi7CdOPzK_nlCalQwNU1kC6ga0_zKtV7Vgt60HioNf__sQVhGiUZf1dgQLxWRqjl3LyDfB3_d7}
}
Cover, T.M. and Thomas, J.A. Elements of information theory 2006   book URL 
BibTeX:
@book{cover_elements_2006,
  author = {Cover, T. M. and Thomas, Joy A.},
  title = {Elements of information theory},
  publisher = {Wiley-Interscience},
  year = {2006},
  edition = {2nd},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1Lb8IwDLbGuAwhMWAIBkz5AyCauK8zAu2yG_eqadMjQypI7N_PTtINEBzjHCJbcvyQv88ASi5Xi5s_QYZKp5rS09wUUYUpb6zRZZQixe9kZaExF53te3XjFQC96WBwuYEJ_cAtKvIuvNJxR2GoosCi0GOKYkxt1jDu-DPzZzL441RfsfDZ0LJ9hWeGG_ThyewH0PNJofAuV5Oo2bvQyAbQ-frjWq2H8LFxE-C1-K6E50FlawuLUfx5g_l2s1t_LvjlzHdrMu11OssRdHOect8fLRquHIPApIopH8DSYIFpKXVhFBahzoNckTrhBMZOn-zguCoyqhiCmLIMOYHR_XfeH11M4eW_3zCDdkX-YObOUL9NhYVz}
}
Carnap, R. Empiricism, Semantics and Ontology. 1950 Revue Internationale de Philosophie
Vol. 4(11), pp. 20-40 
article  
BibTeX:
@article{Carnap1950,
  author = {Carnap, R.},
  title = {Empiricism, Semantics and Ontology.},
  journal = {Revue Internationale de Philosophie},
  year = {1950},
  volume = {4},
  number = {11},
  pages = {20--40}
}
Carnap, R. Empiricism, semantics, and ontology 2011 , pp. 249-264  incollection URL 
BibTeX:
@incollection{carnap_empiricism_2011,
  author = {Carnap, Rudolf},
  title = {Empiricism, semantics, and ontology},
  year = {2011},
  pages = {249--264},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV3NS8MwFH8IXoYe_GTzAwsyT6ukybK2B08yEfQgOs-hNikM7FZoB_rf-97SpWthd49paNK-PN7L-_o9AMHvmd-RCZKHwVhqihkFSWYEajkytZNYMh1L89X2bLv-Xc2zf37wnRsxDYecOffrOpSAnLFuOfe-0svvbJsjpnkxR6lHqRY4X5ociVqnCZH3nIAMnItdN45N4dviYmcOsgmKLhEy2RJJFhK01m7cYoa3gac7CsGl6aG5EZDNIe8IhzzX87R6MAv_8wNVHsH4kA388upcWoyAfeKYbF_3ITXGkRsTImu6LFbllhafHcEBVXZ4VHKB-x_DnlmcQO9t08zh9xRuGxqNPEehkYf08Tb0OYObp-ns8dm3O6jC4nUo9xv8HA4TKhygl5GOug9eRPE8noapjtNxREgJ1LMULz2Z0YmYpAPo71puAMPulCq5YipCqSgI916EqvqpLnYvcQm95iyvYD9D1jPXFhLyD4jE_40}
}
McIrvine, E.C. and Tribus, M. Energy and Information 1971 Scientific American
Vol. 225(3), pp. 179-88. 
article  
BibTeX:
@article{mcirvine_energy_1971,
  author = {McIrvine, Edward C and Tribus, Myron},
  title = {Energy and Information},
  journal = {Scientific American},
  year = {1971},
  volume = {225},
  number = {3},
  pages = {179--188}
}
Gray, R.M. Entropy 2011 Entropy and Information Theory, pp. 61-95  incollection  
BibTeX:
@incollection{gray_entropy_2011-1,
  author = {Gray, Robert M.},
  title = {Entropy},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {61--95}
}
Barnum, H., Barrett, J., Clark, L.O., Leifer, M., Spekkens, R., Stepanik, N., Wilce, A. and Wilke, R. Entropy and information causality in general probabilistic theories 2009 New Journal of Physics
Vol. 75, pp. 032304 
article URL 
BibTeX:
@article{Barnum2009,
  author = {Barnum, Howard and Barrett, Jonathan and Clark, Lisa Orloff and Leifer, Matthew and Spekkens, Robert and Stepanik, Nicholas and Wilce, Alex and Wilke, Robin},
  title = {Entropy and information causality in general probabilistic theories},
  journal = {New Journal of Physics},
  year = {2009},
  volume = {75},
  pages = {032304},
  url = {http://arxiv.org/abs/0909.5075}
}
Barnum, H., Barrett, J., Clark, L.O., Leifer, M., Spekkens, R., Nicholas Stepanik, Wilce, A. and Wilke, R. Entropy and information causality in general probabilistic theories 2012 New Journal of Physics
Vol. 14(12), pp. 129401 
article URL 
Abstract: In this addendum to our paper (2010 New J. Phys. 12 033024), we point out that an elementary consequence of the strong subadditivity inequality allows us to strengthen one of the main conclusions of that paper.
BibTeX:
@article{barnum_entropy_2012,
  author = {Barnum, Howard and Barrett, Jonathan and Clark, Lisa Orloff and Leifer, Matthew and Spekkens, Robert and Nicholas Stepanik and Wilce, Alex and Wilke, Robin},
  title = {Entropy and information causality in general probabilistic theories},
  journal = {New Journal of Physics},
  year = {2012},
  volume = {14},
  number = {12},
  pages = {129401},
  url = {http://stacks.iop.org/1367-2630/14/i=12/a=129401}
}
Gray, R.M. Entropy and Information Theory 2009   book  
BibTeX:
@book{Gray2009,
  author = {Gray, Robert M},
  title = {Entropy and Information Theory},
  publisher = {Springer},
  year = {2009}
}
Robinson, D.W. Entropy and Uncertainty 2010 Entropy
Vol. 10, pp. 493-506 
article  
BibTeX:
@article{Robinson2010,
  author = {Robinson, Derek W},
  title = {Entropy and Uncertainty},
  journal = {Entropy},
  year = {2010},
  volume = {10},
  pages = {493--506}
}
Oppenheim, I. Entropy, Information, and the Arrow of Time 2010 Journal of Physical Chemistry B
Vol. 114(49), pp. 16184–16188 
article  
Abstract: We shall investigate the relationships between the thermodynamic entropy and information theory and the implications that can be drawn for the arrow of time. This demands a careful study of classical thermodynamics and a review of its fundamental concepts. The statistical mechanical properties of time-dependent systems will be carefully studied, and the point at which the arrow of time appears will be described.
BibTeX:
@article{Oppenheim2010,
  author = {Oppenheim, I.},
  title = {Entropy, Information, and the Arrow of Time},
  journal = {Journal of Physical Chemistry B},
  year = {2010},
  volume = {114},
  number = {49},
  pages = {16184--16188}
}
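As background to the entropy–information relationship the abstract examines, the Shannon and Gibbs entropies of the same probability distribution differ only by a constant factor (a textbook identity, not specific to this paper):

```latex
% Shannon entropy (in bits):
H = -\sum_i p_i \log_2 p_i
% Gibbs entropy over the same microstate distribution:
S = -k_B \sum_i p_i \ln p_i = (k_B \ln 2)\, H
```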
Beni, M.D. Epistemic Informational Structural Realism 2016 Minds and Machines
Vol. 26(4), pp. 323-339 
article  
Abstract: The paper surveys Floridi’s attempt for laying down informational structural realism (ISR). After considering a number of reactions to the pars destruens of Floridi’s attack on the digital ontology, I show that Floridi’s enterprise for enriching the ISR by borrowing elements from the ontic form of structural realism (in the pars construens) is blighted by a haunting inconsistency. ISR has been originally developed by Floridi as a restricted and level dependent form of structural realism which remains mainly bonded within the borders of a Kantian perspective. I argue that this perspective doesn’t mesh nicely with the ontic interpretation that Floridi attached to the ISR. I substantiate this claim through the assessment of Floridi’s strategy for reconciling the epistemic and ontic forms of the SR, as well as by close examination of his use of method of levels of abstraction and his notion of semantic information. My proposal is that the ISR could be defended best against the mentioned and similar objections by being interpreted as an extension of the epistemic SR.
BibTeX:
@article{beni_epistemic_2016,
  author = {Beni, Majid D.},
  title = {Epistemic Informational Structural Realism},
  journal = {Minds and Machines},
  year = {2016},
  volume = {26},
  number = {4},
  pages = {323--339}
}
Dretske, F. Epistemology and Information 2008, pp. 29-47  incollection URL 
BibTeX:
@incollection{dretske_epistemology_2008,
  author = {Dretske, Fred},
  title = {Epistemology and Information},
  year = {2008},
  pages = {29--47},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV3NS8MwFA-CIKKg86ObH9DLvLVm_Up68OLcGGw3J3grbZOAh81pK_jn-17TdLWyu4ceXgtp8pK-jzTv9yPE91zqdGwCh7jBS3kWjOCCxIWD44WvLBCU5j5T8vfOdkPrur33zye-ExGjOPRos_2K8einLAt93mZqSDjrBTHZ4IyutphLdSVSMzMm9-et3F_jXgWIL8a8igywsUFxy5tpOMs_dlKn7I8aXhcacapWnNBFfgTm8K13MH_EO06jOcoHKQlFgB12h1jlK_GWlw9y7bw8g1tEqB_Mk-eLZtuLgtcEF1hhA5i-GxykWg4NNuwout_dQ4R2zd83X0UrHFiekiMsEbGxdgM62SN7cn1GTgxBhl3by3Ny09a5DTq3Wzq_IMvpZDmeOTUdhSMRLwsCoTTK8kwIJkIVR0rmIZ7Uk8JnXKSxpwImR8gOkPmUZl4WqtznTKWZ9ISEsPaSHKdYtbAuq-pGYZF9BStNWuj2LRiBRQ5e48UTn83HWuwZ0S2qEjz3o7QgyqgWqhO5rE_sCN6ahmkcKkUDxf1UiTgPYiqRTEhRMSB9raZko9FLkmbCBmTYfZQUXkITxAiMkViAR0n5XV7tbuKaHG6X5U09nluNaPkDSScsyQ}
}
Siegfried, T. Erasing any doubt that information is physical 2012
Vol. 181(7) 
book  
BibTeX:
@book{siegfried_erasing_2012,
  author = {Siegfried, Tom},
  title = {Erasing any doubt that information is physical},
  year = {2012},
  volume = {181},
  number = {7}
}
Shor, P.W. Erratum: Equivalence of additivity questions in quantum information theory (Communication in Mathematical Physics (2004) 246 (453-472)) 2004 Communications in Mathematical Physics
Vol. 246(3), pp. 473 
article  
BibTeX:
@article{shor_erratum:_2004,
  author = {Shor, Peter W.},
  title = {Erratum: Equivalence of additivity questions in quantum information theory (Communication in Mathematical Physics (2004) 246 (453-472))},
  journal = {Communications in Mathematical Physics},
  year = {2004},
  volume = {246},
  number = {3},
  pages = {473}
}
Muneyuki, E., Toyabe, S., Sagawa, T., Sano, M. and Ueda, M. Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality 2010 Nature Physics
Vol. 6(12), pp. 988-992 
article URL 
Abstract: In 1929, Leó Szilárd invented a feedback protocol in which a hypothetical intelligence, dubbed Maxwell's demon, pumps heat from an isothermal environment and transforms it into work. After a long-lasting and intense controversy it was finally clarified that the demon's role does not contradict the second law of thermodynamics, implying that we can, in principle, convert information to free energy. An experimental demonstration of this information-to-energy conversion, however, has been elusive. Here we demonstrate that a non-equilibrium feedback manipulation of a Brownian particle on the basis of information about its location achieves a Szilárd-type information-to-energy conversion. Using real-time feedback control, the particle is made to climb up a spiral-staircase-like potential exerted by an electric field and gains free energy larger than the amount of work done on it. This enables us to verify the generalized Jarzynski equality, and suggests a new fundamental principle of an 'information-to-heat engine' that converts information into energy by feedback control.
BibTeX:
@article{muneyuki_experimental_2010,
  author = {Muneyuki, Eiro and Toyabe, Shoichi and Sagawa, Takahiro and Sano, Masaki and Ueda, Masahito},
  title = {Experimental demonstration of information-to-energy conversion and validation of the generalized Jarzynski equality},
  journal = {Nature Physics},
  year = {2010},
  volume = {6},
  number = {12},
  pages = {988--992},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1LS8NAEB60Injxra1WWLzHZh_JJidRsYhnBW9hn1KoSW3rof317ubVVvEihEDY7CE7w8yXeXwDQMlNGPywCYLpSFnDNFNCM-eySKKiNEqEdJA6jO1mZBsaVoha2o2RLC23LpQPmg8Sh1VSTzh2O_kM_BQpn22tR2psww72xN5Ov_nbfWOZ_d8CrRoko4AwThumIZoMch9IcFgbb_inTXbNlZ0unc_wAJpqqKbopM1Er3rlfxdl__ejDmG_BqnortKqI9gy-THslsWianYCs8e1sQBImw-PMStNQoVFNRerfwzmRWDK7kJUlreXsTkkco2cgo90u8WhUPReEWCPlkajZzFdLvxIbWSqrs_FKbwOH18enoJ6eEOgqIMZgbSSSC1sqLmOnKcUxGJ3qUQSJhzsYMxSymKPULSKpCWpAw421phgEeOE0zPo5EVuuoAU5yl2SFBJ7dCdDhPpNnOsYmwNd7ceXDciyyYVR0dW5tZpkrVy7UG3Emb7ytpSvxFGJqQ_MDWfZa0oenDeLuvxOCMkpB42U3zx58ol7JG27qUPnfn0y1xVTI_fgRby2g}
}
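The generalized Jarzynski equality verified in this experiment incorporates the information gained by feedback; in the Sagawa–Ueda form (quoted from general knowledge rather than from the paper itself) it reads:

```latex
% Ordinary Jarzynski equality:
\langle e^{-\beta (W - \Delta F)} \rangle = 1
% Generalization under measurement and feedback,
% where I is the mutual information acquired:
\langle e^{-\beta (W - \Delta F) - I} \rangle = 1
% implying the second law with feedback:
\langle W \rangle \ge \Delta F - k_B T \langle I \rangle
```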
Freedman, S.J. and Clauser, J.F. Experimental Test of Local Hidden-Variable Theories 1972 Physical Review Letters
Vol. 28(14), pp. 938 
article  
BibTeX:
@article{freedman_experimental_1972,
  author = {Freedman, Stuart J. and Clauser, John F.},
  title = {Experimental Test of Local Hidden-Variable Theories},
  journal = {Physical Review Letters},
  year = {1972},
  volume = {28},
  number = {14},
  pages = {938}
}
Dretske, F.I. Explaining behavior: reasons in a world of causes 1988   book  
BibTeX:
@book{dretske_explaining_1988,
  author = {Dretske, Fred I.},
  title = {Explaining behavior: reasons in a world of causes},
  publisher = {MIT Press},
  year = {1988}
}
Dierckx, B., Fannes, M. and Pogorzelska, M. Fermionic quasifree states and maps in information theory 2008 Journal of Mathematical Physics
Vol. 49(3), pp. 032109 
article  
Abstract: This paper and the results therein are geared toward building a basic toolbox for calculations in quantum information theory of quasifree fermionic systems. Various entropy and relative entropy measures are discussed. The main emphasis is on completely positive quasifree maps. The set of quasifree affine maps on the state space is determined and fully characterized in terms of operations on one-particle subspaces. For a subclass of trace-preserving completely positive maps and for their duals, Choi matrices and Jamiolkowski states are discussed.
BibTeX:
@article{dierckx_fermionic_2008,
  author = {Dierckx, B. and Fannes, M. and Pogorzelska, M.},
  title = {Fermionic quasifree states and maps in information theory},
  journal = {Journal of Mathematical Physics},
  year = {2008},
  volume = {49},
  number = {3},
  pages = {032109}
}
Sherry, D. Fields and the Intelligibility of Contact Action 2015 PHILOSOPHY
Vol. 90(3), pp. 457-477 
article  
Abstract: This article concerns arguments for the impossibility of contact action and, subsequently, the use of force fields to render intelligible apparent cases of contact action. I argue that instead of unraveling the mystery of contact action, fields only deepen the mystery. Further, I show that there is a confusion underlying arguments for the impossibility of contact and present an analysis of contact, based upon Körner's treatment of empirical continuity, which restores intelligibility to apparent cases of contact action.
BibTeX:
@article{sherry_fields_2015,
  author = {Sherry, D.},
  title = {Fields and the Intelligibility of Contact Action},
  journal = {PHILOSOPHY},
  year = {2015},
  volume = {90},
  number = {3},
  pages = {457--477}
}
Sun, F., Hu, D. and Liu, H. Foundations and Practical Applications of Cognitive Systems and Information Processing: Proceedings of the First International Conference on Cognitive Systems and Information Processing, Beijing, China, Dec 2012 (CSIP2012) 2013
Vol. 215 
book URL 
BibTeX:
@book{sun_foundations_2013,
  author = {Sun, Fuchun and Hu, Dewen and Liu, Huaping},
  title = {Foundations and Practical Applications of Cognitive Systems and Information Processing: Proceedings of the First International Conference on Cognitive Systems and Information Processing, Beijing, China, Dec 2012 (CSIP2012)},
  publisher = {Springer},
  year = {2013},
  volume = {215},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3BagIxEB2qvRSE1raibYX5AUvMJrPmKFLpBxSvEk1yVLD6_2ayWXVFjyEhJIFkZl7mvQEo5LcYXb0JZeGNZXErJyhIDrui3VOl1coaOXbURLZvxY0NAnqNYChDypSmBa0Y5F3cygSwsDOkuY5DQZz7EV2NSaW4c2rrhvhesijzF3j0TDPowoPfvMJzXVwB8117g8W54NE_xnAfK2WheKQ4vfh0xm3AWZ0ChFl_PI3PRCMehZkOEM3UOwznP3-z3xGvaJnBm-Uqb5GKHnQsJ71v9okc5_qApVxp4bzXbrxWEyKjQyAXXUBvhQiOBtC7PdnHvY5PeJKp3gNjDF_Q3u8Oflgd0hGXzoQW}
}
Floridi, L. From data to semantic information 2003 Entropy
Vol. 5(2), pp. 125-145 
article URL 
Abstract: There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates the important implications of the revised definition for the analysis of the deflationary theories of truth, the standard definition of knowledge and the classic, quantitative theory of semantic information.
BibTeX:
@article{floridi_data_2003,
  author = {Floridi, Luciano},
  title = {From data to semantic information},
  journal = {Entropy},
  year = {2003},
  volume = {5},
  number = {2},
  pages = {125--145},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1La9wwEBZ9QCmE0JDU2SYFF0ovwYstybJ06KFJugSSWxLIzVgzcskh2STrhfz8zKxfm6X9Az1KNrb4ZpiHNPNJCCWnabJhE0IGpqbQVIYCSUcsBSl5AF1V5OACmenXO9vDXazj3P8g-Bk3jHDhJ4eVi3BH0N3CUUeQOoih34znQvWRVGDG9XjYNkwveb9j_mpXQI3VS50hY-ZNrdL2zCP8Za6zfvmakOWaJcvafuTOKWY6Hx1Gf0i-4UeG6j7ml-FQ8Qezl9_hLTQ_w31yffmWUmPLCfLp-WxIipVcXU87rK27O4GZW2H-sFysefurT2K7C9PjXy28O-JNuN8V3xjamKGNm3ncQxuvQbsnrme_r07Oku6KiQSYoT1BiocrtNLZUCmwhdIm9TYUuc9MyA165qTytgak1yDIWjmDRYHoKRNUyqjPYqviVgT-H8kCI_G-JvUJEfvyiNYdiQ837uLUnp2ftMOdfjhdrPrqpo9NRGCstC8x02JfxDWkdZVbAJSpBjReAkgvUWkvg6v9RBwzRuVDS0hSMkX4amL-9KfsFK90ABZtFpzSoJ3RNjNI8bynr6GrczMR31uEh8_IciHLtJROU0xLmaspm-dmIqKN13rxfvnnkwPxcdTJQ_GueVqGry3P5QtvOUK9}
}
McMullin, E. From matter to materialism … and (almost) back 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 13-37  incollection  
BibTeX:
@incollection{mcmullin_matter_2010,
  author = {McMullin, Ernan},
  title = {From matter to materialism … and (almost) back},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {13--37},
  note = {DOI: 10.1017/CBO9780511778759.002}
}
Jackson, F. From Metaphysics to Ethics: A Defence of Conceptual Analysis. 2003   book  
BibTeX:
@book{Jackson2003,
  author = {Jackson, F.},
  title = {From Metaphysics to Ethics: A Defence of Conceptual Analysis.},
  publisher = {Oxford University Press},
  year = {2003}
}
Siegfried, T. FROM THE EDITOR: Erasing any doubt that information is physical 2012
Vol. 181(7) 
book  
BibTeX:
@book{siegfried_editor:_2012,
  author = {Siegfried, Tom},
  title = {FROM THE EDITOR: Erasing any doubt that information is physical},
  year = {2012},
  volume = {181},
  number = {7}
}
Knuth, D.E. Fundamental Algorithms 1973   book  
BibTeX:
@book{knuth-fa,
  author = {Donald E. Knuth},
  title = {Fundamental Algorithms},
  publisher = {Addison-Wesley},
  year = {1973}
}
Borda, M. Fundamentals in information theory and coding 2011   book  
BibTeX:
@book{borda_fundamentals_2011,
  author = {Borda, Monica},
  title = {Fundamentals in information theory and coding},
  publisher = {Springer-Verlag},
  year = {2011}
}
Stegmann, U.E. Genetic Information as Instructional Content 2005 Philosophy of Science
Vol. 72(3), pp. 425-443 
article  
BibTeX:
@article{stegmann_genetic_2005,
  author = {Stegmann, Ulrich E},
  title = {Genetic Information as Instructional Content},
  journal = {Philosophy of Science},
  year = {2005},
  volume = {72},
  number = {3},
  pages = {425--443}
}
Griffiths, P.E. Genetic Information: A Metaphor in Search of a Theory 2001 Philosophy of Science
Vol. 68(3), pp. 394-412 
article  
BibTeX:
@article{Griffiths2001,
  author = {Griffiths, P. E.},
  title = {Genetic Information: A Metaphor in Search of a Theory},
  journal = {Philosophy of Science},
  year = {2001},
  volume = {68},
  number = {3},
  pages = {394-412}
}
Primiero, G. Giovanni Sommaruga (ed): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information: Lecture Notes in Computer Science, vol. 5363, Springer, New York, 2009, vii+269, $ 64.95, ISBN 978-3-642-00658-6 2011 Minds and Machines
Vol. 21(1), pp. 119-122 
article URL 
BibTeX:
@article{primiero_giovanni_2011,
  author = {Primiero, Giuseppe},
  title = {Giovanni Sommaruga (ed): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information: Lecture Notes in Computer Science, vol. 5363, Springer, New York, 2009, vii+269, \$64.95, ISBN 978-3-642-00658-6},
  journal = {Minds and Machines},
  year = {2011},
  volume = {21},
  number = {1},
  pages = {119--122},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lj9MwEB4hhBASYlkeoTwkX5AAkShx4jy4oUJZCU4UEDfLrywr2GRpUwn-AT-bGTtply4HkHpolLHjdL7OfPZ4xgA5T9J4zya0XBetrcrGVhYduqCyY1Yjs9NVaZQv5XluZRv4diWj-5pMAUpvt3epb1RzIKYFvoZTkWW0wuiqCOPvl5-2cQT6-Gp7vIhLnAtMcc2_9fCHZxrt8xXKDdmsL4RJvfdZHMCUFTPtOtmGonfJ8hd3Zf_HW92EGyMxZS8Dkg7hkutuwcF06AMbbcBt-PWGtq923Qlb9pT3tjlW7ImzT1-wBdHfb8xn–P8m_UtG5OdSPl4f9WfsuUXbNt3bOjZ0p2iWk_MeanQ-idTnWVjRWw2D5mV-x3egY-L1x_mR_F4nENskEXVsUXrIlxjdJO6vHTIVbhuGiW4VYUqRG5bwVVdaKGN4w55jM5adKVFa6wTVS7yu3Bd0bZ_GhoScRvB5WG1cRG5zQhVEsHVz827V_XR23m4PJwuk7VPYUu-DxEiw__D4zKp7gFTVaZNgVDkxhZZltcEyrKkc4Ez46yewbMJFPIs1P-Qu0rPpC2J2pKkLZnOICLYSLINw0oZSfXhKi5yMYPHAUnbTrhcc5lKtJcCZ6z4-qkcfgzYw54cErKiRpo1g-cTZHYD8c8nCidHdISBnNmWHrgv7gXHNvjFy97_R7kHcC2sptNGnofhh38USlf-BnhSIdM}
}
Ward, K. God as the ultimate informational principle 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 282-300  incollection  
BibTeX:
@incollection{ward_god_2010,
  author = {Ward, Keith},
  title = {God as the ultimate informational principle},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {282--300},
  note = {DOI: 10.1017/CBO9780511778759.013}
}
Chaitin, G.J. Gödel's theorem and information 1982 International Journal of Theoretical Physics
Vol. 21(12), pp. 941-954 
article URL 
BibTeX:
@article{chaitin_gos_1982,
  author = {Chaitin, Gregory J.},
  title = {Gödel's theorem and information},
  journal = {International Journal of Theoretical Physics},
  year = {1982},
  volume = {21},
  number = {12},
  pages = {941--954},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V07T8MwELagEhIL4imeUiQGplSO4yT2wAAVtDME1ip-RHRoQLT9a_wB_hh2fW5MsjCzRokl-6y773L3fYdQSoY47vgEoqistAmVdUrrtcyUzEWiKk0zlpjkrvNn288BbJ_9B8OPbe37Ple2_F4sgKk4B5WlDVcxBKX9v4IAUcuA5bhuFW0740dv1Qz0B4DmAgUm5Sh1jATdGL6jHxuU7RQvhxocYUFinrmpJt5TkiS8ESTwe9ypV0EI5U4Xuuedse85J5gZ4MDbGOTr7p3QtGkY9OLKwbdWGn2uZnJ5q5v45Xnb5N3MZt9Pk9dWdRlTF4phg7_1aWGlAJEE0KLcR3tw4NGds-UB2tLNIdqBAz9C0fj7y1jzZhGBLSNjyyiw5TEqHx_K0SSGwRaxNPlbrHlSU6mIFCyr6qLQShPOhHGushAGkDLKc4mFMr4_sVo8ibD8Z451lRe8Vio9QYPmvdGnKDJw0ZLmspxLQTM7m1GY1SlPGdUkFfkZuvZ7m344-ZJp_yzP__TWBdptb88lGiw_V_rKqVf-AM71Mkc},
}
}
Ercan, I. and Anderson, N.G. Heat Dissipation in Nanocomputing: Lower Bounds From Physical Information Theory 2013 IEEE Transactions on Nanotechnology
Vol. 12(6), pp. 1047-1060 
article  
BibTeX:
@article{ercan_heat_2013,
  author = {Ercan, Ilke and Anderson, Neal G.},
  title = {Heat Dissipation in Nanocomputing: Lower Bounds From Physical Information Theory},
  journal = {IEEE Transactions on Nanotechnology},
  year = {2013},
  volume = {12},
  number = {6},
  pages = {1047--1060}
}
Reeb, D., Kastoryano, M.J. and Wolf, M.M. Hilbert's projective metric in quantum information theory 2011 Journal of Mathematical Physics
Vol. 52(8), pp. 082201 
article  
Abstract: We introduce and apply Hilbert's projective metric in the context of quantum information theory. The metric is induced by convex cones such as the sets of positive, separable or positive partial transpose operators. It provides bounds on measures for statistical distinguishability of quantum states and on the decrease of entanglement under protocols involving local quantum operations and classical communication or under other cone-preserving operations. The results are formulated in terms of general cones and base norms and lead to contractivity bounds for quantum channels, for instance, improving Ruskai's trace-norm contraction inequality. A new duality between distinguishability measures and base norms is provided. For two given pairs of quantum states we show that the contraction of Hilbert's projective metric is necessary and sufficient for the existence of a probabilistic quantum operation that maps one pair onto the other. Inequalities between Hilbert's projective metric and the Chernoff bound, the fidelity and various norms are proven.
BibTeX:
@article{reeb_hilberts_2011,
  author = {Reeb, David and Kastoryano, Michael J. and Wolf, Michael M.},
  title = {Hilbert's projective metric in quantum information theory},
  journal = {Journal of Mathematical Physics},
  year = {2011},
  volume = {52},
  number = {8},
  pages = {082201}
}
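For orientation, Hilbert's projective metric on a convex cone C, which the paper applies to cones of quantum states and maps, has the following standard definition (stated from general knowledge, not taken from the paper):

```latex
% For x, y in the interior of a cone C, define
M(x/y) = \inf \{ \lambda > 0 : \lambda y - x \in C \}, \qquad
m(x/y) = \sup \{ \mu > 0 : x - \mu y \in C \}
% Hilbert's projective metric:
h(x, y) = \ln \frac{M(x/y)}{m(x/y)}
```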
Lengagne, T., Aubin, T., Lauga, J. and Jouventin, P. How do king penguins (Aptenodytes patagonicus) apply the mathematical theory of information to communicate in windy conditions? 1999 Proceedings of the Royal Society of London. Series B: Biological Sciences
Vol. 266(1429), pp. 1623-1628 
article  
Abstract: In the king penguin (
BibTeX:
@article{lengagne_how_1999,
  author = {Lengagne, T. and Aubin, T. and Lauga, J. and Jouventin, P.},
  title = {How do king penguins (Aptenodytes patagonicus) apply the mathematical theory of information to communicate in windy conditions?},
  journal = {Proceedings of the Royal Society of London. Series B: Biological Sciences},
  year = {1999},
  volume = {266},
  number = {1429},
  pages = {1623--1628}
}
Sarkar, S. How Genes Encode Information for Phenotypic Traits 2006 Molecular Models of Life: Philosophical Papers on Molecular Biology, pp. 261-286  incollection  
BibTeX:
@incollection{sarkar_how_2006,
  author = {Sarkar, Sahotra},
  title = {How Genes Encode Information for Phenotypic Traits},
  booktitle = {Molecular Models of Life: Philosophical Papers on Molecular Biology},
  publisher = {MIT Press},
  year = {2006},
  pages = {261--286}
}
Floridi, L. How to account for information 2010, pp. 1-15  incollection URL 
BibTeX:
@incollection{floridi_how_2010,
  author = {Floridi, Luciano},
  title = {How to account for information},
  year = {2010},
  pages = {1--15},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV1LT8MwDI6QkBACCcaj450LXFBLm_SRHjgN0KTtxpC4TU2TSjuwDtb9f-IkfVBpd45updZ1LNuf6wdClHi-27MJMuM54TwpRJT7EWFFIJgvUhZKHtNE5H8z280y1_baPz_4XkQM5D3xm_SrPobyZyFMx_MGEhZlVyf0_rjyMTMbInSFoR2c2hyPTQDoOrI6AaBxoIrZ4pQA1OkYoqDj0Uy_ZN9WKmDFbHlE4OpHqKeGLvH0arLWPdS_xHteo6nlU5gEQGEYP8Cw8i-xyKtnuXQ_3pVfhFk_AJQn0ybv5VMFKRls22r5tqO3apqYOjxg8GkLezDYNS9Xm3UnGJgdowNoEMHQuaE4HKAduTxBR_V6DGyt5Sm6VeLGVYmtuLESNO6I-wzdvb3ORmPXvGK-MnM_5s2XknN0mEEDwrLSjYrCQbuFUhrpgAd3FDsO2vtMpy9sPBkZclCT3lp303nflaMCBq1zbuwlQ4Q5ZVxFW7KgFDIFMg0iHnKSpSIjNGH-BRpuY-hy-60rtN8qzbXl88YMnfwFeIkT1g}
}
Greco, G.M., Paronitti, G., Turilli, M. and Floridi, L. How to do philosophy informationally 2005
Vol. 3782, pp. 623-634 
inproceedings URL 
BibTeX:
@inproceedings{greco_how_2005,
  author = {Greco, Gian M. and Paronitti, Gianluca and Turilli, Matteo and Floridi, Luciano},
  title = {How to do philosophy informationally},
  year = {2005},
  volume = {3782},
  pages = {623--634},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV1LS-RAEC5EUVwW1Fk3PlbJQb0MM2TSM8nk4MH1wYAPBBX2FjrpBAbc6dlJD_jzt6srncS43veYF1SS6urqr-v7CoD5fa_Xign5IBkLz48EyxPBwiRNoiANdDY7SPiQD9MWsm27J9bn_vMf38qI_zn53JWbAg9SmVqrru3bUA1jnVNOZ-nr0tBZCh04UOu46H548GJh6ohIjaMh3IlQ-4ebf05lqcGqTOlR1PRF07dOdoXszm3nBKQbVsRJ_tooKsUlukHrMfTccyqDtgxtjEBUgICXsWNbXSCyQOiI-EdKZbL2TLmYCiKBLxHDke-QjlED6bALXkSpcC-XhKct8UsHdb0sojiZURwPUJ2RkRqqDfQspD5HZawOfNaY9oMSU32nyN2aKav6RaQT69STnaE–28xTdV5Nuu9POlUAOWMUNP0alJBfUgUG6GqYZkgoGYjbW6R3SXlyLxXuF6hcfSeKGGbyvmyaKQ9z9uwWxNC3cfK23ZgJZt1YMOSJjqwZZ3MLZ2sA18aSpff4ES7gKukK6Rbu4DbcoFdOL65fr6c9MiUeE5KKLH9Duw7fOXIyJgpw9wUDqzlehRlDqY0jrbagY1f0d3VeHJ7SYc79rBfGHph_49y9Acyg7AX9MM9cPVUMQpDLnju8SHLB-MwYSIa6lM-16tmbx9OW_b4ceHHXsxYaIQl9VIgVm9qH5xP7D749MohbNYu-ANW1WKZHZFU51-JdoWV}
}
Chaitin, G.J. How to run algorithmic information theory on a computer: Studying the limits of mathematical reasoning 1996 Complexity
Vol. 2(1), pp. 15-21 
article URL 
BibTeX:
@article{chaitin_how_1996,
  author = {Chaitin, Gregory J.},
  title = {How to run algorithmic information theory on a computer: Studying the limits of mathematical reasoning},
  journal = {Complexity},
  year = {1996},
  volume = {2},
  number = {1},
  pages = {15--21},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V3dT9swELdGJyF4mAYDMdgkvxS1QimJ3bRN2ZAgK1qlTUV8iTfLSd2BBAmi5YH_njt_NSqP4wkpD4mt9OL417uLffc7QjhrhcGCTuhxrPHA47bMuhMW5_mYS7g94bHKwQDJhZVtV2hz3vYeJl4XiSv3Hp-KPXn3r4SP_xuMf7cMqXq6dfbiM-4TSB1TjmUd4IkxpvDZpU_dYeaTjvO498yuugiAnOoV3KpXi0oFiTVnfn0-vZG3lp7AZsHY_aexzbjr-AgqpxPDbidgXWsXlW1D8oLYJLs7Rcpe4cUoRZOvac2ryYd-pbgNESw41efDdFhniReAdFXIIJPoJQboYUgyUedpFMMJHEfDX0F6-ucaLge8FbbSUZ0fs2AwN3BuU3_B7vloRMPczIRA2bglnwiULIxcaBBMRCKKhQBZQssSXIQiHUHHYBc52u_Ht_nspyqCy_MlshT3ujXy8fjsGnOZHAF0R_O3-pe5TA7tuPcbKLfpR9wwcuG1NFk_-hHF_b4f46EZ4QGMr-JIVTyii8_kk_2UoUcGgmvkgyrWyepfj5bpOlmzpmNKG5bfvPmFKEAonZUUEEorCKUVhFKDUApnkjqE9h0-sZcafNJyQqv4pB6fG-TqZHCR_g7c44sHQ6wi_m8i-CapFWWhtggNE4ZEl5K38147y1iWJeCqqXHYUUhXNflKRm8sfPvNf3GHrMz_jd9Ibfb4pL4bstAXoPOkIQ}
}
Brooks, M. If Information... Then Universe 2012
Vol. 215(2884) 
book  
Abstract: Brooks discusses that the universe is a computer and everything that goes on in it can be explained in terms of information processing. The connection between reality and computing may not be immediately obvious, but strip away the layers and that's exactly what some researchers think people find. People think of the world as made up of particles held together by forces, but quantum theory tells them that these are just a mess of fields they can only properly describe by invoking the mathematics of quantum physics.
BibTeX:
@book{brooks_if_2012,
  author = {Brooks, Michael},
  title = {If Information... Then Universe},
  year = {2012},
  volume = {215},
  number = {2884}
}
Barwise, J. and Seligman, J. Imperfect information flow 1993, pp. 252-260  inproceedings URL 
Abstract: The view that computers are information processors is commonplace. They are used, for the most part successfully, throughout our society, as reliable links in the transmission of information and knowledge. Yet the formulation of a precise, qualitative conception of information and a theory of the transmission of information has proved elusive, despite the many other successes of computer science. The authors set out the motivation for and a skeleton of a new mathematical model of information flow, one that is compatible with less than perfect flow.
BibTeX:
@inproceedings{barwise_imperfect_1993,
  author = {Barwise, J. and Seligman, J.},
  title = {Imperfect information flow},
  year = {1993},
  pages = {252--260},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1NS8NAEB1EURShNtbYaiEHr0k3n909V0uLFQQ9eAubdHMqtSYt_n1nNtmWKih4yxAIm93svMnOmzcAYeAx95tP8BPF50WE2BxIIQohEVgE-WemCkSsYP9kG0zXO-p3oqloyqNLndlHuxrIrErlYqFlBWXZNM_A4D_m5I6p-Q5-2sl0d9zCYkRa4WsFSJ8nIf5XJI3-jrFjk8VkYjCbjl6okC_06ofu9V7R0DNugclCGMrJNg-9q5T_Scn-zytdQGdX_-c8b5GtDQdqaUHLNIBwGn9gwdnTVvS1suBYs0nz6hJ6UwzGSyKKOI0wKy2_UyzePztwN354HU1cGla6qrUuUqajyCQinTg1VCK8gnNJ3PvlWtfozW04KnCjKJvAy8a5seHkTczu-eRxVJttY3qVLiTzPtY2YqXeZ27iDa_BkdLPOJNcBT6J27Asw6hCKl_lPMplLrrQ3R8VqycmXc2LLvR_HXHvj_s3cFqTGOlY5RYO1-VG9WtRxi_Agco3}
}
Campbell, D.E. Incentives: motivation and the economics of information 2006   book  
BibTeX:
@book{campbell_incentives:_2006,
  author = {Campbell, Donald E.},
  title = {Incentives: motivation and the economics of information},
  publisher = {Cambridge University Press},
  year = {2006},
  edition = {2nd}
}
Kohlas, J. and Schneuwly, C. Information algebra 2009
Vol. 5363, pp. 95-127 
inproceedings  
BibTeX:
@inproceedings{kohlas_information_2009,
  author = {Kohlas, Jürg and Schneuwly, Cesar},
  title = {Information algebra},
  year = {2009},
  volume = {5363},
  pages = {95--127}
}
Küppers, B.-O. Information and communication in living matter 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 170-184  incollection  
BibTeX:
@incollection{kuppers_information_2010,
  author = {Küppers, Bernd-Olaf},
  title = {Information and communication in living matter},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {170--184},
  note = {DOI: 10.1017/CBO9780511778759.009}
}
Harms, W.F. Information and meaning in evolutionary processes 2004   book URL 
BibTeX:
@book{harms_information_2004,
  author = {Harms, William F.},
  title = {Information and meaning in evolutionary processes},
  publisher = {Cambridge University Press},
  year = {2004},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1NSwMxEB2sghQ8WLWpH4X8gV262Ww-ztVS0IMHD95KtpuABxelRfDfO7PJgiv1OMlhH2Gyj3nMmwCUIl9kf_4JpS-CVEFYRw8lOVFXrhKywcrbFq4KeqhsH6obBwb0XsFA7qWRYyMYYZH361bG2VHaWCOTP9Ugs0mRJu70sR4M3-sYZXUOx-QymMCRby9g_Nw_KPB9CUVyCNGJcazz-bt3JF7wt5b7r5QpiJR_xB5_v7uC-erhZbnO6CubJMhs6gR7IaZw5qiRvd13hreGwUnArPOMmIAhIganr_bp3qwflzGc9GG-61xZ-eeeIfF0SZupXM-AWyl8UF4FI7fUPVorXzpnHa65rdHNNUwPw7n5b-MWxrFhhZSHu4RxHs_uB5lGg-A}
}
Bassi, A., Ghosh, S. and Singh, T. Information and the foundations of quantum theory 2015 It From Bit or Bit From It?, pp. 87-95  incollection  
BibTeX:
@incollection{bassi_information_2015,
  author = {Bassi, Angelo and Ghosh, Saikat and Singh, Tejinder},
  title = {Information and the foundations of quantum theory},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {87--95}
}
Davies, P.C.W. and Gregersen, N.H. Information and the nature of reality: from physics to metaphysics 2010   book  
BibTeX:
@book{davies_information_2010,
  author = {Davies, P. C. W. and Gregersen, Niels H.},
  title = {Information and the nature of reality: from physics to metaphysics},
  publisher = {Cambridge University Press},
  year = {2010}
}
Plastino, A. and Plastino, A.R. Information and thermal physics 2007, pp. 119-154  incollection URL 
BibTeX:
@incollection{plastino_information_2007,
  author = {Plastino, Angelo and Plastino, Angelo R.},
  title = {Information and thermal physics},
  year = {2007},
  pages = {119--154},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV1LS8QwEA6CID5Ad9WuL-xFby1J-szB06os7N5cwduStgnsYR-y9f8706QPK969NQ2EdDKdzEzm-0JIwH3q9WxCpCUVOlGaCS0Y10HOuFQyEyKKZFyR7Xcy2w2vTfvuny98zyPG5gOnTfoVLRw4w-VybbArWK-6-bsHQfWgH5uu1lh0UlkXKaN7uEKwVrWcP3MESSdHUIWKEMYhVJJaw2dsFbO2ymx7zHA5_7KobREF86pRwB-H8Rdhu4HUh-a9faWp9oOohVdXIj0infmqWOblk1p772-wcyIbEIbS01mTGUNS_CAIDd7fTtuSc9XtuOGPMhTBvakh7Wu-2X7tOq7C_IwcI3zERVwHzG5A9tR6SE7ryzNca0uH5KjDBHlO7juCd0HwrhW8awV_QeavL_PxxLN3V3i7hIWeKNIsD_C0XaUsySGCKUTGIVpUUqpYUMUUp7HWRZBjCFsUYBZ1lFNFCxnCf8UvyYlEiMO6rKCQhUP2NailctBHcOCTHHLwIWbP6WQ6Ns1B3fR3FV7P_ywdcEkqrfZiPxkRV9NMqjSNeJCLkCt4DFMJ8XIoeRJlUl-RkZHbYmuoThbN0l3_3XVDDludu7XzvDO0lt9O2DIH}
}
Al-Safi, S.W. and Short, A.J. Information causality from an entropic and a probabilistic perspective 2011 Phys. Rev. A
Vol. 84(4), pp. 042323 
article DOI URL 
BibTeX:
@article{al-safi_information_2011,
  author = {Al-Safi, Sabri W. and Short, Anthony J.},
  title = {Information causality from an entropic and a probabilistic perspective},
  journal = {Phys. Rev. A},
  year = {2011},
  volume = {84},
  number = {4},
  pages = {042323},
  url = {http://link.aps.org/doi/10.1103/PhysRevA.84.042323},
  doi = {http://doi.org/10.1103/PhysRevA.84.042323}
}
Floridi, L. Information closure and the sceptical objection 2014 Synthese
Vol. 191(6), pp. 1037-1050 
article URL 
Abstract: In this article, I define and then defend the principle of information closure (pic) against a sceptical objection similar to the one discussed by Dretske in relation to the principle of epistemic closure. If I am successful, given that pic is equivalent to the axiom of distribution and that the latter is one of the conditions that discriminate between normal and non-normal modal logics, a main result of such a defence is that one potentially good reason to look for a formalization of the logic of "S is informed that p" among the non-normal modal logics, which reject the axiom, is also removed. This is not to argue that the logic of "S is informed that p" should be a normal modal logic, but that it could still be, insofar as the objection that it could not be, based on the sceptical objection against pic, has been removed. In other words, I shall argue that the sceptical objection against pic fails, so such an objection provides no ground to abandon the normal modal logic B (also known as KTB) as a formalization of "S is informed that p", which remains plausible insofar as this specific obstacle is concerned.
BibTeX:
@article{floridi_information_2014,
  author = {Floridi, Luciano},
  title = {Information closure and the sceptical objection},
  journal = {Synthese},
  year = {2014},
  volume = {191},
  number = {6},
  pages = {1037--1050},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9NAEB4hkFAlBKSAMVDJB5BAyMbrXb-OKG1VCQ6Il7it1vtAgsoNiXPg3zOzXrsR6aE9bnYSJflmZ-bzPBaAF1me_mcTmo4ZU3a5Eg6dAnrF0ghWV7nT2ubK18vuPNmGYn6S0f_OpgSlt9uXrW-sKKjSh6cU9abE2tFVkY5__vJ9ziMgGQiDGdu0bsp6ymte9Qn7RpnGh-qL1Xazkya90kF5Z3T6AKYmmakIZc5MX_bO7xdp3-BHPoT7IU5N3o-KtYBbtj-Eg0_TxQd_D2ERzMImeR1mV795BO9CexPBnWiU3a5tonqTYJyZbHwJDSpFctH98iVg_WP4dnrydXmWhjsZ0p9MFCw1xinVIQ9yWjDhSrSzVrW2MWg5labLYwwXqm7aEll3hWxIVZ2rHeuqtqEAjT-Be4pq9_vB9_iZCO44PGg2IucX4T8Zwd0f7cfj5uzDclwupmW28Y1o2Z8hQnz9OU2rrH4KibBCFUprLTQTrOlwwQ0SM-saLnJnY4gIWkmHdlgrLXlFTbg5r3FnQlua83PJaRhNi8yd4c4IvlyNcz9kWaDiClbE8HaCa97zSEmMprgMyEhETK6Mi-HVnjTKifAW5MuyItkYXu4q1ixLrBVjbGqnooA8BnYdsWWY8E6TDYZn1_wKz-GAXh4Lk17A7WG9tUfjfMp_GDoWpA}
}
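The axiom of distribution (axiom K) that the abstract equates with pic is standard modal logic background, not quoted from the paper; reading the box as "S is informed that", it is the schema:

```latex
% Axiom K (distribution), with $\Box\varphi$ read as "S is informed that $\varphi$":
\[
  \Box(p \rightarrow q) \rightarrow (\Box p \rightarrow \Box q)
\]
```

Normal modal logics such as B (KTB) validate this schema, while non-normal logics reject it, which is why the abstract ties the defence of pic to the choice between them.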
Weiss, O., Jiménez-Montaño, M.A. and Herzel, H. Information Content of Protein Sequences 2000 Journal of Theoretical Biology
Vol. 206(3), pp. 379-386 
article URL 
Abstract: The complexity of large sets of non-redundant protein sequences is measured. This is done by estimating the Shannon entropy as well as applying compression algorithms to estimate the algorithmic complexity. The estimators are also applied to randomly generated surrogates of the protein data. Our results show that proteins are fairly close to random sequences. The entropy reduction due to correlations is only about 1%. However, precise estimations of the entropy of the source are not possible due to finite sample effects. Compression algorithms also indicate that the redundancy is in the order of 1%. These results confirm the idea that protein sequences can be regarded as slightly edited random strings. We discuss secondary structure and low-complexity regions as causes of the redundancy observed. The findings are related to numerical and biochemical experiments with random polypeptides. Copyright 2000 Academic Press
BibTeX:
@article{weiss_information_2000,
  author = {Weiss, Olaf and Jiménez-Montaño, Miguel A. and Herzel, Hanspeter},
  title = {Information Content of Protein Sequences},
  journal = {Journal of Theoretical Biology},
  year = {2000},
  volume = {206},
  number = {3},
  pages = {379--386},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1LawMhEJYSKPTS9yN9wN7ayybu6qpLT32FHgttz6KrwvawCSU95N_XUbdpSnPJHmUEnQFndL_vG4RIOcL5nzOBcmsa7pyqClWXzmImSFNh43y5YBjTqy_bkRqTQJYpE8QTPpzdaWScfDuetS0wfksgYRIQESqIABHuigkA-U0e75cyvDQ0DQwYdrDuZRwxG3_MdRuIKyOYvy5NLSlPK1VoyEaTPdRDoHoUys-v6SV5_h-U9ua73Ee7qXDN7qLdAdqy3SHajq0sF0foJvGaIM5ZkLzq5tnUZS-gA9F22WsP2T5G75Ont4fnPHVhyJuyrFleNqwwoPlTaUWFhebklYEq0tciinFWaSdq58sg1lBTW9IwYpktONcWK15ocoIG3bSzZyjzpYAzitPaUkP9TVAJoWxTO4U1oZwVQ3TdO13OotiGjLLKTEJ4oGcmlrDxISr6mMgV10mfCdbOuU3BkypxDwLWWKqZt4x24c5UBqhf-Ij07sdx9mkM-K-V1QLk8843WMsF2omcfnjLuUSD-eeXvYpKkN_wXfJm}
}
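As a concrete illustration of the entropy estimation the abstract describes (a minimal sketch; the paper's estimators handle finite-sample effects far more carefully):

```python
import math
from collections import Counter

def shannon_entropy(sequence):
    """Shannon entropy in bits per symbol, from empirical symbol frequencies."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A uniformly random 20-letter amino-acid alphabet gives log2(20) ~ 4.32 bits
# per residue; the ~1% redundancy the paper reports lowers this only slightly.
```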
Chatzisavvas, K.C., Moustakidis, C.C. and Panos, C.P. Information entropy, information distances, and complexity in atoms 2005 Journal of Chemical Physics
Vol. 123(17), pp. 174111 
article  
Abstract: Shannon information entropies in position and momentum spaces and their sum are calculated as functions of Z (2 ≤ Z ≤ 54) in atoms. Roothaan-Hartree-Fock electron wave functions are used. The universal property S = a + b ln Z is verified. In addition, we calculate the Kullback-Leibler relative entropy, the Jensen-Shannon divergence, Onicescu's information energy, and a complexity measure recently proposed. Shell effects at closed-shell atoms are observed. The complexity measure shows local minima at the closed-shell atoms, indicating that for the above atoms complexity decreases with respect to neighboring atoms. It is seen that complexity fluctuates around an average value, indicating that the atom cannot grow in complexity as Z increases. Onicescu's information energy is correlated with the ionization potential. Kullback distance and Jensen-Shannon distance are employed to compare Roothaan-Hartree-Fock density distributions with other densities of previous works.
BibTeX:
@article{chatzisavvas_information_2005,
  author = {Chatzisavvas, K. C. and Moustakidis, Ch C. and Panos, C. P.},
  title = {Information entropy, information distances, and complexity in atoms},
  journal = {Journal of Chemical Physics},
  year = {2005},
  volume = {123},
  number = {17},
  pages = {174111}
}
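Two of the distance measures named in the abstract, Kullback-Leibler relative entropy and the Jensen-Shannon divergence, can be sketched for discrete distributions (an illustration only; the paper works with continuous electron densities):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: a symmetrised, bounded variant of KL."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```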
Deutsch, D. and Hayden, P. Information flow in entangled quantum systems 2000 Proceedings of the Royal Society of London. Series A. Mathematical, Physical and Engineering Sciences
Vol. 456(1999), pp. 1759-1774 
article  
BibTeX:
@article{deutsch_information_2000,
  author = {Deutsch, David and Hayden, Patrick},
  title = {Information flow in entangled quantum systems},
  journal = {Proceedings of the Royal Society of London. Series A. Mathematical, Physical and Engineering Sciences},
  year = {2000},
  volume = {456},
  number = {1999},
  pages = {1759--1774}
}
Devlin, K. Information flow: the logic of distributed systems by Jon Barwise and Jerry Seligman 1998 Complexity
Vol. 4(2), pp. 30-32 
article URL 
BibTeX:
@article{devlin_information_1998,
  author = {Devlin, Keith},
  title = {Information flow: the logic of distributed systems by Jon Barwise and Jerry Seligman},
  journal = {Complexity},
  year = {1998},
  volume = {4},
  number = {2},
  pages = {30--32},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1Rb9MwELaASRNoQnRAtgGSX4rah2SJnTZJgUltOrRuQ0WsoL5ZLXZEpa1jTWDw77mznTQqj-wJqQ-NlfZyzufz-Xz3mRDOPN_dsAkslnOu8DiRLIMBlSUcic7ibM7mEhyMcCOyXR5YtW77H168LTAyOYSX17dl5oY2cugbSuTKxWOuwNc0RM45OqGncP9gtrpd5GZH4VStVmBYMGX5ykKoZDTANHT1a1FU4fih-mkPfj9Ti-LbOpSga-uCKpRgrJ8fdV0W2RlQ2TakKeiYsvbSZIY1ZLCa-bNbLGYiNXHLv0y0oXwF9_lilI6aLKkEIDEVPJS24AHck4TQQazJU-7DF_j0R0M3_Xg-jZr8mHu-l46bfMDciiZ1zaS9McNVeYeGo5kJgbJx8z0RKFkYuSJgIhRMcF8IkCW0LMGFL9IxNPdfIxv7lVx8Ld6ppfv54j4s9WNY8G8NPk2xaqmkeu5qptaqM7fJkdX7sIVy25XGLSP3MGDtsMfecr_Xq3Q8Mhq-Af1qLlPN95k8IY_tooX2Ddga5J5a7pJHHyrG33yXNOwkkdOWZTJvPyWTGhYpYrFH4SdUI5FeZ7SGRGqRSOe_KSCRWiRSQCLVSKQlEp-RL–PJ-mJWz6t-G4YU8S_9Tt_TnZmWO-xLHRdqHTIVgZjVDnoMDnQHQ7Znibnw_jkLDWXjfLSy3XxondTOPB-9BB3u160R2jcgaWRSsBFlzKMeTCLOlJGrBsjAVKWqX0yvmM9Du78H1-Qh-tx_JI8KFY_1CtDKPoHWDWmQg}
}
Losee, R.M. Information from Processes: About the Nature of Information Creation, Use, and Representation 2012   book  
BibTeX:
@book{losee_information_2012,
  author = {Losee, Robert M.},
  title = {Information from Processes: About the Nature of Information Creation, Use, and Representation},
  publisher = {Springer},
  year = {2012}
}
Collier, J. Information in Biological Systems 2008, pp. 763-787  incollection  
BibTeX:
@incollection{collier_information_2008,
  author = {Collier, John},
  title = {Information in Biological Systems},
  year = {2008},
  pages = {763--787}
}
Demirel, Y. Information in Biological Systems and the Fluctuation Theorem 2014 Entropy
Vol. 16(4), pp. 1931-1948 
article URL 
Abstract: Some critical trends in information theory, its role in living systems and utilization in fluctuation theory are discussed. The mutual information of thermodynamic coupling is incorporated into the generalized fluctuation theorem by using information theory and nonequilibrium thermodynamics. Thermodynamically coupled dissipative structures in living systems are capable of degrading more energy, and processing complex information through developmental and environmental constraints. The generalized fluctuation theorem can quantify the hysteresis observed in the amount of the irreversible work in nonequilibrium regimes in the presence of information and thermodynamic coupling.
BibTeX:
@article{demirel_information_2014,
  author = {Demirel, Y.},
  title = {Information in Biological Systems and the Fluctuation Theorem},
  journal = {Entropy},
  year = {2014},
  volume = {16},
  number = {4},
  pages = {1931--1948},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3Z3PS8MwFMeDeBJEFJXNH5CDeutM-7I0OXjQ6RA8uh08jaRJZeA2mdv_b17TddlQ8Oy1ZA3kG16_eXvvE0Ig67BkKyZoLpnNVG6V08qC9juBWWWLjCmdu6oJPs5sr67LXD_7D8LXDUarIsZAWQrNj4FO3hRNlh9LbB-pRoaGxslGoh6L2NfAgUc38eGxyhi_6ete91r19TxOGoQ2zY4LIQ6ZnBwY24iBItKaRwHN-7s0-jimKnAxtwMvgMJKRZcKxtc_ieHWWx-dphQQ_zZEl6BukHU-seNiceemyfAVsQA8xSq9x5d-c4aGLBUBEIVT3jYT1lcuIPC1mH0uv370EZVnGBySg9rs0_sg0hHZcdNjEgtEx1O6FojWAlEvEPWC0EggWgt0Qob9p0HvOakvsUhsxnKRGGZK6Z0C19yIotTgI7oUzvtk7UyegwVeAFIRc5NqK5QEELJw1nifKkrgcEr2NTY7TBdVU6RtEeqM4RmDLlJdubFOSY1gQn94lsYoptvkARdj9BmAJSNEiFcPZvP3Ub0xR5KVxvjobUshuBJOl1rm4FyXCYOUwTZphaVsXtNI1SZX8eI2A_CADd2KWok2qE3Svwzr1TB6hDAszn6f9Zzs4VYO-a8LsruYL91loGd-A360Y4U}
}
Godfrey-Smith, P. Information in Biology 2007, pp. 103-119  incollection URL 
Abstract: INTRODUCTION The concept of information has acquired a strikingly prominent role in contemporary biology. This trend is especially marked within genetics, but it has also become important in other areas, such as evolutionary theory and developmental biology, especially where these fields border on genetics. The most distinctive biological role for informational concepts, and the one that has generated the most discussion, is in the description of the relations between genes and the various structures and processes that genes play a role in causing. For many biologists, the causal role of genes should be understood in terms of their carrying information about their various products. That information might require the cooperation of various environmental factors before it can be "expressed," but the same can be said of other kinds of message. An initial response might be to think that this mode of description is entirely anchored in a set of well-established facts about the role of DNA and RNA within protein synthesis, summarized in the familiar chart representing the "genetic code," mapping DNA base triplets to amino acids. However, informational enthusiasm in biology predates even a rudimentary understanding of these mechanisms (Schrodinger 1944). And more importantly, current applications of informational concepts extend far beyond anything that can receive an obvious justification in terms of the familiar facts about the specification of protein molecules by DNA.
BibTeX:
@incollection{godfrey-smith_information_2007,
  author = {Godfrey-Smith, Peter},
  title = {Information in Biology},
  publisher = {Cambridge University Press},
  year = {2007},
  pages = {103--119},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV3LS8MwGP9QQRkKuum6-YDivV3btHkctToHDuZhB2-lzQO8bNNth_33Jm1apzLYMTRpST6a7_n7fQAo8gPvz52AhZIFyXmhklgx7XCHClMiCqoChVVJzLQV2YbBjoR-SAZpOhlbIn5tLGiHwS8Ztw-1JW5IHl-GTYTFkNFRhCxAtZzMyl5B2s3XSpBGln-neZNlgKqf0wbws-Or23QMhmmUzxfr5ZZ2Gp5DXX9cV6U0qeofMP3_qu19d30BpwYQ4RqkghZCGw7krAPHVRPLTQdab3U3hM0ldC28yYjb_Zi5dtYVTIfP03Tk2dYLHk8o8-K8iGXEqD5CxJJASANVwETFBUo4KSIsOY54QUIeYspzlVN9cySMCREQiQhFXTjLTYX-bFUi-YQDR6uvtXSMhnP06Thw8s7GT3T0mlbDdj30lyXazP9cOVqhlj-jh33SAzfOEdfWqAoM9peY9C0PEm2hSixkkoeiD_eNODLO51n6OPklzwfte_ahV8kpW1RsHhk26GRjal7vs_4GWlV01wRhbqtd3VUUjt_bfsJq}
}
Levy, A. Information in Biology: A Fictionalist Account 2011 Noûs
Vol. 45(4), pp. 640-657 
article  
BibTeX:
@article{levy_information_2011,
  author = {Levy, Arnon},
  title = {Information in Biology: A Fictionalist Account},
  journal = {Noûs},
  year = {2011},
  volume = {45},
  number = {4},
  pages = {640--657}
}
Page, D.N. Information in black hole radiation 1993 Physical Review Letters
Vol. 71(23), pp. 3743-3746 
article URL 
Abstract: If black hole formation and evaporation can be described by an S matrix, information would be expected to come out in black hole radiation. An estimate shows that it may come out initially so slowly, or else be so spread out, that it would never show up in an analysis perturbative in M(Planck)/M, or in 1/N for two-dimensional dilatonic black holes with a large number N of minimally coupled scalar fields.
BibTeX:
@article{page_information_1993,
  author = {Page, Don N.},
  title = {Information in black hole radiation},
  journal = {Physical Review Letters},
  year = {1993},
  volume = {71},
  number = {23},
  pages = {3743--3746},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5EFATxLVsfUNCjXfNok-Yoi4sgexEFbyFNE_CyyG4Vf76Zttt9qLBe26GQTDqTSb7vGwDO-iRZiQk8o9a4kB6sE6YUzsu0yJ1X1tmwp62VlRZOtufc9eULfUr4LQIjn9wnkl36kvZ5yIBIXRcC2xfw0WsXhikTTRjmiEEgspUc-uMbS2lpCxkhH9Nfc1Gdd4b7MCM9zPAm3SX0nCb_E4-93ngOYK_djcZ3zfI5hA03PoLtGhVqp8dw1ZKV0Hnx2zgu8LQvxo668QQ1DfD5CbwM758HD0nbVSExISqyhAlrwx6iEEYoEcotm6FCoaKpyQrpiA3_fJn60lNvQmmUl0rI1BDjpC0d9Tbnp7BrEH0_rmqWXtmDuKCSWu6zzDibUssKFcoklQdfZ0Qwn0VwM5td_d6oaOi6-iBcL0yAllTjBETQazzQGTOFhWdYKBFcr77SU6aJThn2cedMCV19VcFs0XWdMYr88tGjEikmahYBXcds0Mqlo0xAdfa_sZzDTo2RrBEwF7BZTT7cZaP5-A01H-zp}
}
Sarkar, S. Information in Genetics and Developmental Biology: Comments on Maynard Smith 2000 Philosophy of Science
Vol. 67(2), pp. 208-213 
article  
BibTeX:
@article{sarkar_information_2000,
  author = {Sarkar, Sahotra},
  title = {Information in Genetics and Developmental Biology: Comments on Maynard Smith},
  journal = {Philosophy of Science},
  year = {2000},
  volume = {67},
  number = {2},
  pages = {208--213}
}
Kamp, H. and Stokhof, M. Information in Natural Language 2008, pp. 49-111  incollection  
BibTeX:
@incollection{kamp_information_2008,
  author = {Kamp, Hans and Stokhof, Martin},
  title = {Information in Natural Language},
  booktitle = {Philosophy of Information},
  publisher = {Elsevier},
  year = {2008},
  pages = {49--111}
}
Bekenstein, J.D. Information in the Holographic Universe 2003 Scientific American
Vol. 289(2) 
article URL 
Abstract: By studying the mysterious properties of black holes, physicists have deduced absolute limits on how much information a region of space or a quantity of matter and energy can hold. The holographic principle holds that the universe is like a hologram: the seemingly three-dimensional universe could be completely equivalent to alternative quantum fields and physical laws painted on a distant, vast surface.
BibTeX:
@article{bekenstein_information_2003,
  author = {Bekenstein, Jacob D.},
  title = {Information in the Holographic Universe},
  journal = {Scientific American},
  year = {2003},
  volume = {289},
  number = {2},
  url = {http://www.phys.huji.ac.il/~bekenste/Holographic_Univ.pdf}
}
Landauer, R. Information is a Physical Entity 1999 Physica A: Statistical Mechanics and its Applications
Vol. 263(1-4), pp. 63-67 
article  
Abstract: This paper, associated with a broader conference talk on the fundamental physical limits of information handling, emphasizes the aspects still least appreciated. Information is not an abstract entity but exists only through a physical representation, thus tying it to all the restrictions and possibilities of our real physical universe. The mathematician's vision of an unlimited sequence of totally reliable operations is unlikely to be implementable in this real universe. Speculative remarks about the possible impact of that, on the ultimate nature of the laws of physics are included.
BibTeX:
@article{Landauer1999,
  author = {Landauer, R.},
  title = {Information is a Physical Entity},
  journal = {Physica A: Statistical Mechanics and its Applications},
  year = {1999},
  volume = {263},
  number = {1-4},
  pages = {63--67}
}
Long, B.R. Information is Intrinsically Semantic but Alethically Neutral 2014 Synthese
Vol. 191(14), pp. 3447-3467 
article  
Abstract: In this paper I argue that, according to a particular physicalist conception of information, information is both alethically neutral or non-alethic, and is intrinsically semantic. The conception of information presented is physicalist and reductionist, and is contrary to most current pluralist and non-reductionist philosophical opinion about the nature of information. The ontology assumed for this conception of information is based upon physicalist non-eliminative ontic structural realism. However, the argument of primary interest is that information so construed is intrinsically semantic on a reductionist and non-alethic basis where semantic content is constituted by indication along causal pathways. Similar arguments have been presented by philosophers with respect to representation. I suggest the conception of information that I present is correct by the lights of the best applied mathematical and scientific theories of information. If so, there is no need for any separate theory of semantic information. Thus I present a theory of intrinsically semantic information which also constitutes an informational theory of truth where truth reduces to information. In the last section I discuss weakly and strongly semantic information, and reject them in favour of alethically neutral intrinsically semantic information.
BibTeX:
@article{Long2013,
  author = {Long, B. R.},
  title = {Information is Intrinsically Semantic but Alethically Neutral},
  journal = {Synthese},
  year = {2014},
  volume = {191},
  number = {14},
  pages = {3447--3467}
}
Landauer, R. Information is physical 1991 Physics Today
Vol. 44(5), pp. 23-29 
article URL 
BibTeX:
@article{landauer_information_1991,
  author = {Landauer, Rolf},
  title = {Information is physical},
  journal = {Physics Today},
  year = {1991},
  volume = {44},
  number = {5},
  pages = {23--29}
}
Hawking, S.W. Information loss in black holes 2005 Physical Review D - Particles, Fields, Gravitation and Cosmology
Vol. 72(8) 
article URL 
Abstract: The question of whether information is lost in black holes is investigated using Euclidean path integrals. The formation and evaporation of black holes is regarded as a scattering problem with all measurements being made at infinity. This seems to be well formulated only in asymptotically AdS spacetimes. The path integral over metrics with trivial topology is unitary and information preserving. On the other hand, the path integral over metrics with nontrivial topologies leads to correlation functions that decay to zero. Thus at late times only the unitary information preserving path integrals over trivial topologies will contribute. Elementary quantum gravity interactions do not lose information or quantum coherence.
BibTeX:
@article{hawking_information_2005,
  author = {Hawking, S. W.},
  title = {Information loss in black holes},
  journal = {Physical Review D - Particles, Fields, Gravitation and Cosmology},
  year = {2005},
  volume = {72},
  number = {8}
}
Frolov, V.P. Information loss problem and a ‘black hole’ model with a closed apparent horizon 2014 Journal of High Energy Physics
Vol. 2014(5), pp. 1-21 
article  
BibTeX:
@article{frolov_information_2014,
  author = {Frolov, Valeri P.},
  title = {Information loss problem and a ‘black hole’ model with a closed apparent horizon},
  journal = {Journal of High Energy Physics},
  year = {2014},
  volume = {2014},
  number = {5},
  pages = {1--21}
}
Tononi, G. Information measures for conscious experience 2001 Arch Ital Biol
Vol. 139 
article  
BibTeX:
@article{tononi_information_2001,
  author = {Tononi, G.},
  title = {Information measures for conscious experience},
  journal = {Arch Ital Biol},
  year = {2001},
  volume = {139}
}
Hawking, S.W. Information Preservation and Weather Forecasting for Black Holes 2014
arXiv:1401.5761 [hep-th] 
article URL 
Abstract: It has been suggested [1] that the resolution of the information paradox for evaporating black holes is that the holes are surrounded by firewalls, bolts of outgoing radiation that would destroy any infalling observer. Such firewalls would break the CPT invariance of quantum gravity and seem to be ruled out on other grounds. A different resolution of the paradox is proposed, namely that gravitational collapse produces apparent horizons but no event horizons behind which information is lost. This proposal is supported by ADS-CFT and is the only resolution of the paradox compatible with CPT. The collapse to form a black hole will in general be chaotic and the dual CFT on the boundary of ADS will be turbulent. Thus, like weather forecasting on Earth, information will effectively be lost, although there would be no loss of unitarity.
BibTeX:
@article{hawking_information_2014,
  author = {Hawking, S. W.},
  title = {Information Preservation and Weather Forecasting for Black Holes},
  year = {2014},
  note = {arXiv:1401.5761 [hep-th]},
  url = {http://arxiv.org/abs/1401.5761}
}
Floridi, L. Information Quality 2013 Philosophy & Technology
Vol. 26(1), pp. 1-6 
article URL 
BibTeX:
@article{floridi_information_2013,
  author = {Floridi, Luciano},
  title = {Information Quality},
  journal = {Philosophy \& Technology},
  year = {2013},
  volume = {26},
  number = {1},
  pages = {1--6}
}
Long, B.R. Information Science Fiction Theory and Informationist Science Fiction 2009 School: The University of Sydney  mastersthesis  
BibTeX:
@mastersthesis{Long2009,
  author = {Long, B. R.},
  title = {Information Science Fiction Theory and Informationist Science Fiction},
  school = {The University of Sydney},
  year = {2009}
}
Moles, A.A. Information theory and esthetic perception 1966   book  
BibTeX:
@book{moles_information_1966,
  author = {Moles, Abraham A.},
  title = {Information theory and esthetic perception},
  publisher = {University of Illinois Press},
  year = {1966}
}
Tribus, M. Information Theory and Thermodynamics 1963 Heat Transfer, Thermodynamics, and Education: Boelter Anniversary Volume, pp. 348-68  incollection  
Abstract: LCCN: 63022596
BibTeX:
@incollection{tribus_information_1963,
  author = {Tribus, Myron},
  title = {Information Theory and Thermodynamics},
  booktitle = {Heat Transfer, Thermodynamics, and Education: Boelter Anniversary Volume},
  publisher = {McGraw Hill},
  year = {1963},
  pages = {348--68}
}
Gatenby, R.A. and Frieden, B.R. Information Theory in Living Systems, Methods, Applications, and Challenges 2007 Bulletin of mathematical biology
Vol. 69(2), pp. 635-657 
article  
BibTeX:
@article{gatenby_information_2007,
  author = {Gatenby, Robert A. and Frieden, B. R.},
  title = {Information Theory in Living Systems, Methods, Applications, and Challenges},
  journal = {Bulletin of mathematical biology},
  year = {2007},
  volume = {69},
  number = {2},
  pages = {635--657}
}
Adami, C. Information theory in molecular biology 2004 Physics of Life Reviews
Vol. 1(1), pp. 3-22 
article  
Abstract: This article introduces the physics of information in the context of molecular biology and genomics. Entropy and information, the two central concepts of Shannon's theory of information and communication, are often confused with each other but play transparent roles when applied to statistical ensembles (i.e., identically prepared sets) of symbolic sequences. Such an approach can distinguish between entropy and information in genes, predict the secondary structure of ribozymes, and detect the covariation between residues in folded proteins. We also review applications to molecular sequence and structure analysis, and introduce new tools in the characterization of resistance mutations, and in drug design.
BibTeX:
@article{adami_information_2004,
  author = {Adami, Christoph},
  title = {Information theory in molecular biology},
  journal = {Physics of Life Reviews},
  year = {2004},
  volume = {1},
  number = {1},
  pages = {3--22}
}
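Adami's abstract turns on Shannon's per-symbol entropy of ensembles of symbolic sequences. As a minimal illustrative sketch only (a plug-in estimate from a single sequence's symbol frequencies, not the ensemble machinery or tools the paper describes):

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Empirical Shannon entropy of a symbolic sequence, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform four-letter (e.g. nucleotide) sequence carries the maximum
# 2 bits per symbol; a constant sequence carries none.
print(shannon_entropy("ACGTACGTACGT"))  # → 2.0
```
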
Doyle, L.R., McCowan, B., Johnston, S. and Hanser, S.F. Information theory, animal communication, and the search for extraterrestrial intelligence 2011 Acta Astronautica
Vol. 68(3), pp. 406-417 
article  
BibTeX:
@article{doyle_information_2011,
  author = {Doyle, Laurance R. and McCowan, Brenda and Johnston, Simon and Hanser, Sean F.},
  title = {Information theory, animal communication, and the search for extraterrestrial intelligence},
  journal = {Acta Astronautica},
  year = {2011},
  volume = {68},
  number = {3},
  pages = {406--417}
}
Bavaud, F. Information theory, relative entropy and statistics 2009
Vol. 5363, pp. 54-78 
inproceedings  
BibTeX:
@inproceedings{bavaud_information_2009,
  author = {Bavaud, Francois},
  title = {Information theory, relative entropy and statistics},
  booktitle = {Formal Theories of Information},
  year = {2009},
  volume = {5363},
  pages = {54--78}
}
Godfrey-Smith, P. Information, Arbitrariness, and Selection: Comments on Maynard Smith 2000 Philosophy of Science
Vol. 67(2), pp. 202-207 
article  
BibTeX:
@article{godfrey-smith_information_2000,
  author = {Godfrey-Smith, Peter},
  title = {Information, Arbitrariness, and Selection: Comments on Maynard Smith},
  journal = {Philosophy of Science},
  year = {2000},
  volume = {67},
  number = {2},
  pages = {202--207}
}
Hodgson, G.M. and Knudsen, T. Information, complexity and generative replication 2008 Biology & Philosophy
Vol. 23(1), pp. 47-65 
article URL 
Abstract: The established definition of replication in terms of the conditions of causality, similarity and information transfer is very broad. We draw inspiration from the literature on self-reproducing automata to strengthen the notion of information transfer in replication processes. To the triple conditions of causality, similarity and information transfer, we add a fourth condition that defines a "generative replicator" as a conditional generative mechanism, which can turn input signals from an environment into developmental instructions. Generative replication must have the potential to enhance complexity, which in turn requires that developmental instructions are part of the information that is transmitted in replication. Demonstrating the usefulness of the generative replicator concept in the social domain, we identify social generative replicators that satisfy all of the four proposed conditions.
BibTeX:
@article{hodgson_information_2008,
  author = {Hodgson, Geoffrey M. and Knudsen, Thorbjørn},
  title = {Information, complexity and generative replication},
  journal = {Biology \& Philosophy},
  year = {2008},
  volume = {23},
  number = {1},
  pages = {47--65}
}
Wheeler, J. Information, physics, quantum: the search for links 1989 Proceedings III International Symposium on Foundations of Quantum Mechanics, pp. 354-368  inproceedings  
BibTeX:
@inproceedings{wheeler_information_1989,
  author = {Wheeler, John},
  title = {Information, physics, quantum: the search for links},
  booktitle = {Proceedings III International Symposium on Foundations of Quantum Mechanics},
  year = {1989},
  pages = {354--368}
}
Floridi, L. Information, possible worlds and the cooptation of scepticism 2010 Synthese
Vol. 175(S1), pp. 63-88 
article URL 
Abstract: The article investigates the sceptical challenge from an information-theoretic perspective. Its main goal is to articulate and defend the view that either informational scepticism is radical, but then it is epistemologically innocuous because redundant; or it is moderate, but then epistemologically beneficial because useful. In order to pursue this cooptation strategy, the article is divided into seven sections. Section 1 sets up the problem. Section 2 introduces Borel numbers as a convenient way to refer uniformly to (the data that individuate) different possible worlds. Section 3 adopts the Hamming distance between Borel numbers as a metric to calculate the distance between possible worlds. In Sects. 4 and 5, radical and moderate informational scepticism are analysed using Borel numbers and Hamming distances, and shown to be either harmless (extreme form) or actually fruitful (moderate form). Section 6 further clarifies the approach by replying to some potential objections. In the conclusion, the Peircean nature of the overall approach is briefly discussed.
BibTeX:
@article{floridi_information_2010,
  author = {Floridi, Luciano},
  title = {Information, possible worlds and the cooptation of scepticism},
  journal = {Synthese},
  year = {2010},
  volume = {175},
  number = {S1},
  pages = {63--88}
}
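Floridi's abstract adopts the Hamming distance between Borel numbers as a metric on possible worlds. A minimal sketch of that metric (the fixed-length binary world-encodings below are invented for illustration and are not Floridi's Borel numbers):

```python
def hamming(w1, w2):
    """Hamming distance: the number of positions at which two
    equal-length strings of world-individuating data differ."""
    if len(w1) != len(w2):
        raise ValueError("worlds must be encoded at the same length")
    return sum(a != b for a, b in zip(w1, w2))

# Two toy "possible worlds" encoded as 8-bit strings:
print(hamming("01101001", "01011011"))  # → 3
```

Identical encodings sit at distance 0; a world differing in every datum is maximally distant.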
Haught, J.F. Information, theology, and the universe 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 301-318  incollection  
BibTeX:
@incollection{haught_information_2010,
  author = {Haught, John F.},
  title = {Information, theology, and the universe},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {301--318},
  note = {DOI: 10.1017/CBO9780511778759.014}
}
Floridi, L. Information: a very short introduction 2010
Vol. 225 
book URL 
BibTeX:
@book{floridi_information:_2010,
  author = {Floridi, Luciano},
  title = {Information: a very short introduction},
  publisher = {Oxford University Press},
  year = {2010},
  volume = {225}
}
Jablonka, E. Information: Its Interpretation, Its Inheritance, and Its Sharing 2002 Philosophy of Science
Vol. 69(4), pp. 578-605 
article URL 
Abstract: The semantic concept of information is one of the most important, and one of the most problematical concepts in biology. I suggest a broad definition of biological information: a source becomes an informational input when an interpreting receiver can react to the form of the source (and variations in this form) in a functional manner. The definition accommodates information stemming from environmental cues as well as from evolved signals, and calls for a comparison between information transmission in different types of inheritance systems: the genetic, the epigenetic, the behavioral, and the cultural-symbolic. This comparative perspective highlights the different ways in which information is acquired and transmitted, and the role that such information plays in heredity and evolution. Focusing on the special properties of the transfer of information, which are very different from those associated with the transfer of materials or energy, also helps to uncover interesting evolutionary effects and suggests better explanations for some aspects of the evolution of communication.
BibTeX:
@article{jablonka_information:_2002,
  author = {Jablonka, Eva},
  title = {Information: Its Interpretation, Its Inheritance, and Its Sharing},
  journal = {Philosophy of Science},
  year = {2002},
  volume = {69},
  number = {4},
  pages = {578--605},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1NS8QwEB3EkyDqror1A3vwoGA1Sbdt6kVkUfTmQc8hbZPTsqitqP_emTTtfqAgXrbQbbqbdHbmZWfmPYBYXLBoySfYTKaS2Vxok-p0ZIzJiPikwnhjqtLRds_9sz0r1fn4uHBVgi6nj3CpmJjLTh4oxp2M4NcvrxFpSFGu1QtqoEPmgrVsueM-pcAla91yzCMpSO9nLhB5d9yWJC6CTeoOea-XxUl-jFMuJt1tQlfg1NWi9AnqWQv9D7Xa_5rrFmx45BretKY2gBUzHcLaYyeF8DWEgXcUdXjq2azPtuHGNzyRAVyFD00dLtY5nvtz1IfYkAmeh3pauZNEJo2RdQee726fxveR122ISgyEaYSIi1OHq5XaFrif0rEsmWUWN084D6YRQSCItCPLEdoLm2VFluR4YCbJMokxdRfWNdX3TxvXB1jtQVhVeCuOL7rELanJC1wWXqBZaSELzosAjrunqV5ang7l8usyVe3iBbBLD1nRD7d506XKk0RQEjSAHbfu_bgYUTTiZhzQmYGqJhMliHoxTSUO2GuNYvZJcZKRkJQI4GTpLaFqoZhiI2pZRmwsVPPZBHC0YE3KO466_64n89bV38zRG-WUKnc4IwD-l8vGnu2dWA6a_V9mewBrndAN44ew2ry9m6OWm_IbD-UlVA}
}
Calude, C.S. Information: the algorithmic paradigm 2009
Vol. 5363, pp. 79-94 
inproceedings  
BibTeX:
@inproceedings{calude_information:_2009,
  author = {Calude, Cristian S.},
  title = {Information: the algorithmic paradigm},
  year = {2009},
  volume = {5363},
  pages = {79--94}
}
Gillies, D. Informational Realism and World 3 2010 Knowledge Technology and Policy
Vol. 23, pp. 7-14 
article  
BibTeX:
@article{Gillies2010,
  author = {Gillies, D.},
  title = {Informational Realism and World 3},
  journal = {Knowledge Technology and Policy},
  year = {2010},
  volume = {23},
  pages = {7--14}
}
Knuth, K.H. Information-based physics and the influence network 2015 It From Bit or Bit From It?, pp. 65-78  incollection  
BibTeX:
@incollection{knuth_information-based_2015,
  author = {Knuth, Kevin H.},
  title = {Information-based physics and the influence network},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {65--78}
}
Roy Frieden, B. Information-based uncertainty for a photon 2007 Optics Communications
Vol. 271(1), pp. 71-72 
article DOI URL 
Abstract: It is shown on the basis of Fisher information that the ultimate root-mean-square uncertainty in the position of a single photon of wavelength λ is 0.112λ in vacuum. This is as well an "effective size" for the photon.
BibTeX:
@article{roy_frieden_information-based_2007,
  author = {Roy Frieden, B.},
  title = {Information-based uncertainty for a photon},
  journal = {Optics Communications},
  year = {2007},
  volume = {271},
  number = {1},
  pages = {71--72},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1bS8MwFD6IKAji_VIv0Ad9EeqyNE3bR50Of4A-hyRtcT5sxXWg_96cJK1zTJS9NmlpcpJzcpLv-wIQ01sSLfgEncZViaGZcGWWIDFTeSE502WuVVxYWvfczrajxniQpY8EzsNb3-2f9Hzf9urRCBm_JjFHNSpuVc6Qdp6YzA-FHx_uO9-Ml307oUYSYe2WTGcRX5O6QQSJPaBAxFfa_y1Yee-9gcyR2XRpzLLxabgLLSiqxaV0h9XfdPoluO3V270HO34pG965evuwVo4PYNNCSvX0EG480wktH2GwLEITQx0CofkMTVkow_p1YhafR_AyfHwePEX-ZoZIo-B7VBGeJ7JklOhMKTOpk0zqQhY00YWuqMwrVTCtc6qVNEke1UQyLrUuJS8V6vkcw7ZEBP-4sUy_4hRCWSWk5FXKE6UY7XPJK6oq5NHitaQsCSBqbSFqp8QhWojam3C2w0s1OT41tgsgbQ0mfvSlMMHijzdP0L4Cp3jzLrUwq2LKMprzAK6dybs_oGJKBREm0WccteuTTDQfjfnCQj1TatIYngZwNT9WunJipSHzjGU4SkkA_f9UG3gdd9QvaM5WbvA5bLkta9xZuoD15n1WXjpdyi-4YCPN},
  doi = {http://doi.org/10.1016/j.optcom.2006.10.071}
}
Long, B.R. Informationist Science Fiction and Informationist Science Fiction Theory 2009 School: The University of Sydney  phdthesis URL 
BibTeX:
@phdthesis{long_informationist_2009,
  author = {Long, Bruce R.},
  title = {Informationist Science Fiction and Informationist Science Fiction Theory},
  school = {The University of Sydney},
  year = {2009},
  url = {http://ses.library.usyd.edu.au/bitstream/2123/5838/1/InformationistScienceFictionTheoryand%20InformationistScienceFictionBruceRLongePublish.pdf}
}
Adami, C. Information-Theoretic Considerations Concerning the Origin of Life 2015 Origins of Life and Evolution of Biospheres
Vol. 45(3), pp. 309-317 
article  
Abstract: Research investigating the origins of life usually either focuses on exploring possible life-bearing chemistries in the pre-biotic Earth, or else on synthetic approaches. Comparatively little work has explored fundamental issues concerning the spontaneous emergence of life using only concepts (such as information and evolution) that are divorced from any particular chemistry. Here, I advocate studying the probability of spontaneous molecular self-replication as a function of the information contained in the replicator, and the environmental conditions that might enable this emergence. I show (under certain simplifying assumptions) that the probability to discover a self-replicator by chance depends exponentially on the relative rate of formation of the monomers. If the rate at which monomers are formed is somewhat similar to the rate at which they would occur in a self-replicating polymer, the likelihood to discover such a replicator by chance is increased by many orders of magnitude. I document such an increase in searches for a self-replicator within the digital life system avida.
BibTeX:
@article{adami_information-theoretic_2015,
  author = {Adami, Christoph},
  title = {Information-Theoretic Considerations Concerning the Origin of Life},
  journal = {Origins of Life and Evolution of Biospheres},
  year = {2015},
  volume = {45},
  number = {3},
  pages = {309--317}
}
Beavers, A.F. and Harrison, C.D. Information-theoretic teleodynamics in natural and artificial systems 2012   incollection  
BibTeX:
@incollection{beavers_information-theoretic_2012,
  author = {Beavers, Anthony F. and Harrison, Christopher D.},
  title = {Information-theoretic teleodynamics in natural and artificial systems},
  year = {2012}
}
Frank, S. Input-output relations in biological systems: measurement, information and the Hill equation 2013 Biology Direct
Vol. 8(1), pp. 31-31 
article URL 
Abstract: Biological systems produce outputs in response to variable inputs. Input-output relations tend to follow a few regular patterns. For example, many chemical processes follow the S-shaped Hill equation relation between input concentrations and output concentrations. That Hill equation pattern contradicts the fundamental Michaelis-Menten theory of enzyme kinetics. I use the discrepancy between the expected Michaelis-Menten process of enzyme kinetics and the widely observed Hill equation pattern of biological systems to explore the general properties of biological input-output relations. I start with the various processes that could explain the discrepancy between basic chemistry and biological pattern. I then expand the analysis to consider broader aspects that shape biological input-output relations. Key aspects include the input-output processing by component subsystems and how those components combine to determine the system's overall input-output relations. That aggregate structure often imposes strong regularity on underlying disorder. Aggregation imposes order by dissipating information as it flows through the components of a system. The dissipation of information may be evaluated by the analysis of measurement and precision, explaining why certain common scaling patterns arise so frequently in input-output relations. I discuss how aggregation, measurement and scale provide a framework for understanding the relations between pattern and process. The regularity imposed by those broader structural aspects sets the contours of variation in biology. Thus, biological design will also tend to follow those contours. Natural selection may act primarily to modulate system properties within those broad constraints. Reviewers: This article was reviewed by Eugene Koonin, Georg Luebeck and Sergei Maslov.
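As a quick gloss on the contrast the abstract draws (not part of the entry itself): the hyperbolic Michaelis-Menten response is the n = 1 case of the Hill equation y = x^n / (K^n + x^n), while Hill coefficients n > 1 give the S-shaped curve the paper discusses. A minimal sketch, with illustrative parameter values not taken from the paper:

```python
def hill(x, K=1.0, n=1.0):
    """Fractional output for input concentration x (Hill equation).

    n = 1 reproduces the Michaelis-Menten hyperbola; n > 1 gives the
    S-shaped (cooperative) response.
    """
    return x**n / (K**n + x**n)

# Hyperbolic (Michaelis-Menten) vs. cooperative response at low,
# half-saturating, and high input concentrations:
mm = [round(hill(x, n=1), 3) for x in (0.1, 1.0, 10.0)]
coop = [round(hill(x, n=4), 3) for x in (0.1, 1.0, 10.0)]
# mm   -> [0.091, 0.5, 0.909]  (gradual)
# coop -> [0.0, 0.5, 1.0]      (switch-like, S-shaped)
```

Both curves cross 0.5 at x = K; the larger Hill coefficient only sharpens the transition around that point.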
BibTeX:
@article{frank_input-output_2013,
  author = {Frank, S. A.},
  title = {Input-output relations in biological systems: measurement, information and the Hill equation},
  journal = {Biology Direct},
  year = {2013},
  volume = {8},
  number = {1},
  pages = {31--31},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwpZ3PS8MwFMeDioIg4m-rDnLwaLVJ27TZTcUx73oRIWRNIkKtc2sP_ve-13Z1mx4ETz28JpR8SfJN-vIJISG_DPylMQGWYNaBe5bCiEgzbbjTQo5GwlmE4JnFne1u-2Pphz5LxRU46NhHirmP-3qr4IiC-tjWzdNsAIYFR30RXfdmy2r8WXrpkHv-c1iem5fW8ZhINf11gqono8EO2W5dJL1uZN8lK7bYIxvNvZKf–T5vhhXpf9elfCgk1m-G30taANdQmVoA3Ge9unb9z7hBW1JqliA6sJQMIh0-Jrn1H40VPAD8ji4e7gd-u01Cr4WXHLfSLBARgehlU66LNZyJHkqskBIk-gI1oQ2MVYIl4ogAwcYZszxLGEZ41ZG0OUPyZbGdPuirI_lmWNCQcfEgTVkTEeRTZh0DEl4oYklT7Jk5JH-QsuqccPOUEizXoxAx1Ioi0JZVKpC5pEeyqAQT1Fg_suLrqZTBVYxZMgn9MhRG3fvUEW2EJkpp0yeQ10xx1jKMFIL2X0Jj0IYYSMJkUbZLhIjAl9G3CPn81J3cVxHIqYOzTZMKB5hf3nttmWuI2ugPPlP-5ySTY6XcmBSTXxG1spJZXsNRvILJp4GeA}
}
Edlund, J.A., Chaumont, N., Hintze, A., Koch, C., Tononi, G. and Adami, C. Integrated information increases with fitness in the evolution of animats 2011 PLoS Computational Biology
Vol. 7(10), pp. e1002236 
article  
Abstract: One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat") evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its "fit" to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data.;One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. 
We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat") evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its "fit" to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data.;One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat”) evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. 
As the animat adapts by increasing its "fit” to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data.;One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well-defined. We present here several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat") evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with more conventional information-theoretic processing measures. As the animat adapts by increasing its "fit" to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing as well as integration, but that information integration may be a better measure when the task requires memory. 
A correlation of measures of information integration (but also information processing) and fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data.;
BibTeX:
@article{edlund_integrated_2011,
  author = {Edlund, Jeffrey A. and Chaumont, Nicolas and Hintze, Arend and Koch, Christof and Tononi, Giulio and Adami, Christoph},
  title = {Integrated information increases with fitness in the evolution of animats},
  journal = {PLoS Computational Biology},
  year = {2011},
  volume = {7},
  number = {10},
  pages = {e1002236}
}
Frieden, B.R. Introduction to Fisher information: Its origin, uses, and predictions 2007 , pp. 1-41  incollection  
BibTeX:
@incollection{frieden_introduction_2007,
  author = {Frieden, B. R.},
  title = {Introduction to Fisher information: Its origin, uses, and predictions},
  year = {2007},
  pages = {1--41}
}
Kawahigashi, Y., Pérez García, D. and Ruskai, M.B. Introduction to Special Issue: Operator Algebras and Quantum Information Theory 2016 Journal of Mathematical Physics
Vol. 57(1), pp. 15101 
article  
BibTeX:
@article{kawahigashi_introduction_2016,
  author = {Kawahigashi, Yasuyuki and Pérez García, David and Ruskai, Mary B.},
  title = {Introduction to Special Issue: Operator Algebras and Quantum Information Theory},
  journal = {Journal of Mathematical Physics},
  year = {2016},
  volume = {57},
  number = {1},
  pages = {15101}
}
Black, E., Floridi, L. and Third, A. Introduction to the special issue on the nature and scope of information 2010 Synthese
Vol. 175(S1), pp. 1-3 
article URL 
Abstract: Issue Title: Special issue on The Nature and Scope of Information
BibTeX:
@article{black_introduction_2010,
  author = {Black, Elizabeth and Floridi, Luciano and Third, Allan},
  title = {Introduction to the special issue on the nature and scope of information},
  journal = {Synthese},
  year = {2010},
  volume = {175},
  number = {S1},
  pages = {1--3},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV07T8MwED6hTl2AFgrhIXlCgAg473hCCFGVjQFmy47tqUoLLRL8e852ktICA6t1thJfcg_f5-8Akviahhs2wWC2bLIyyjSrJHogoVKaZEkuhDJxbMz6yXZ3kmFBlg4l6Gr6GC7Jqb5JMcix3Rpu56-h7R5lq6xNKw00xRHaYt_KvCsmYEbQsDOysCitI_rmgjwKcS2-3CiJOk8z3oEW2tYiTLqy8-pi_C8I7H–wS5sN5EoufOfzgC2dD2E_lPb2uBzCIPmx1-Q84ad-mIPJo8W26486SxZzghGkGThu9gTp0Vix3HQc4YSUSvi7r6QmSENT6uduw8v44fn-0nYtGMIK0wDo1CXOjVFYWRJLQmeZEJXIkW10iQWFa3ylEmZmxRNuTSKaSGrXJfGFDpDi8poMoJePav1IRAMGqvIZeaYjiUyl4WljVEmS1VGBc0CuGw1xOeedYOv-JWjOGYc80jOioTxjwBGboc7yXZ7AzholcrVdMrjgtGCYVIeBXDV6ni1vlvWggN4s_N-_bkyAZz9EHeCfg5GYDxywkd_Pcox9D3swCIHT6C3fHvXp5758QtuzfNq}
}
Davies, P. and Gregersen, N.H. Introduction: does information matter? 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 1-10  incollection  
BibTeX:
@incollection{davies_introduction:_2010,
  author = {Davies, Paul and Gregersen, Niels Henrik},
  title = {Introduction: does information matter?},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {1--10},
  note = {DOI: 10.1017/CBO9780511778759.001}
}
Baumgaertner, B. and Floridi, L. Introduction: The Philosophy of Information 2016 Topoi
Vol. 35(1), pp. 157-159 
article  
BibTeX:
@article{baumgaertner_introduction:_2016,
  author = {Baumgaertner, Bert and Floridi, Luciano},
  title = {Introduction: The Philosophy of Information},
  journal = {Topoi},
  year = {2016},
  volume = {35},
  number = {1},
  pages = {157--159}
}
Floridi, L. Is Semantic Information Meaningful Data? 2005 Philosophy and Phenomenological Research
Vol. 70(2), pp. 351-371 
article  
Abstract: There is no consensus yet on the definition of semantic information. This paper contributes to the current debate by criticising and revising the Standard Definition of semantic Information (SDI) as meaningful data, in favour of the Dretske-Grice approach: meaningful and well-formed data constitute semantic information only if they also qualify as contingently truthful. After a brief introduction, SDI is criticised for providing necessary but insufficient conditions for the definition of semantic information. SDI is incorrect because truth-values do not supervene on semantic information, and misinformation (that is, false semantic information) is not a type of semantic information, but pseudo-information, that is not semantic information at all. This is shown by arguing that none of the reasons for interpreting misinformation as a type of semantic information is convincing, whilst there are compelling reasons to treat it as pseudo-information. As a consequence, SDI is revised to include a necessary truth-condition. The last section summarises the main results of the paper and indicates some interesting areas of application of the revised definition.
BibTeX:
@article{Floridi2005,
  author = {Floridi, L.},
  title = {Is Semantic Information Meaningful Data?},
  journal = {Philosophy and Phenomenological Research},
  year = {2005},
  volume = {70},
  number = {2},
  pages = {351--371}
}
Adams, F. and de Moraes, J.A. Is There a Philosophy of Information? 2016 Topoi
Vol. 35(1), pp. 161-171 
article  
Abstract: In 2002, Luciano Floridi published a paper called What is the Philosophy of Information?, where he argues for a new paradigm in philosophical research. To what extent should his proposal be accepted? Is the Philosophy of Information actually a new paradigm, in the Kuhnian sense, in Philosophy? Or is it only a new branch of Epistemology? In our discussion we will argue in defense of Floridi's proposal. We believe that the Philosophy of Information has the types of features possessed by other areas already acknowledged as authentic in Philosophy. By way of an analogical argument we will argue that since the Philosophy of Information has its own topics, methods and problems it would be counter-intuitive not to accept it as a new philosophical area. To strengthen our position we present and discuss main topics of the Philosophy of Information.
BibTeX:
@article{adams_is_2016,
  author = {Adams, Fred and de Moraes, João A.},
  title = {Is There a Philosophy of Information?},
  journal = {Topoi},
  year = {2016},
  volume = {35},
  number = {1},
  pages = {161--171}
}
Aguirre, A., Foster, B. and Merali, Z. It From Bit or Bit From It?: On Physics and Information 2015   book URL 
BibTeX:
@book{aguirre_it_2015,
  author = {Aguirre, Anthony and Foster, Brendan and Merali, Zeeya},
  title = {It From Bit or Bit From It?: On Physics and Information},
  publisher = {Springer International Publishing},
  year = {2015},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1BT8MgFH5xLhqNia66qnMJf6ANFFjhZOK02aJHD96W0dLEg1ts2f8XaGs6M0_kweVBHjz4wvc9AJrEOPpzJqiEayVJ6cRXdKHWUmjJJaG5wiLXvrpPD9k-9G7cI6B3CAaTDNNUDGBgH3m9XekBFqcVTqTjc9gws2mNzXijuPNrsz3xPZ9RsisYakczGMGR3gRw2V4GUbvVatvV1Vvo-gI474kHBnDiP2_m9TVMlgZl1fYLPX0atK184-2lebyBafbyPl9EzoNVC9asVDslSsdwsXaf3DfGk-GKEIaljUgduiwRWm9DOP2Qb89i8TpvzFFnxrVnbMXfJrRJyQd0NIvTW0CpZqRQONE2gBhJsEilyBPKSyztoV3mdzA-7M79fwMTOLN3CN6gEg9wbKqdnjbL-gPr4YmN}
}
Planat, M. It from Qubit: How to Draw Quantum Contextuality 2014 Information
Vol. 5(2), pp. 209-218 
article URL 
Abstract: Wheeler’s observer-participancy and the related it from bit credo refer to quantum non-locality and contextuality. The mystery of these concepts slightly starts unveiling if one encodes the (in)compatibilities between qubit observables in the relevant finite geometries. The main objective of this treatise is to outline another conceptual step forward by employing Grothendieck’s dessins d’enfants to reveal the topological and (non)algebraic machinery underlying the measurement acts and their information content.
BibTeX:
@article{planat_it_2014,
  author = {Planat, Michel},
  title = {It from Qubit: How to Draw Quantum Contextuality},
  journal = {Information},
  year = {2014},
  volume = {5},
  number = {2},
  pages = {209--218},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V27TsMwFLWACQbEUzwlTyxVwK8kDhIDUBAVE4LOlR_XVQdaVAXB52M7oXUrJsTEaqmR65xcX1-few5CnJ2TbCkmQE6cAibBGZ4LU9rCUWv8hya1UIrapcr2983MfOw_vPhe3fSMPL3rUaz7Rde4Sac7VR-Bw-k3mddOFKX6rGNH5cLFbtuelF7PN2zRaG-k6rRKQFNySWQKBXbiQp9vdzQMriRptWtGT5jHIOYziIyJxuXnHH4Ya4NonmCFpQGRVMneyppYuxy2Oa8CzzEM5T59nf0mFcdO5FCXxLEDjXAwmQ4H7ZIPVMEdt7asKmYEB5DaaeY0VZZJagScBVX1Vzsy9RWMs_7zqj-yy3Bw7z7ez-pzpCyD00DwJvz-v017Z5jsRTLVBbOHmJW8bKHN9jiBr5s5baMVGO-gjURkcheRXo0DIHAExCX2cMD1BAc44BYOeAEOe6h_f_dy-5C1PhmZ8udfmnFmDQ_-JTwH4NwJYqlVyjkqNJelMUJbvzFxw6WzhgpgyieVrhCqcqUlju-jtfFkDAcIKwKUVKCc1CCAlDr0OcvC6IKqSjJziG7icr81UiiDX63_0V885Bitz3F-gtbq6TucNlqaX7AabcA}
}
Steward, H. I—What is a Continuant? 2015 Aristotelian Society Supplementary Volume
Vol. 89(1), pp. 109-123 
article  
Abstract: In this paper, I explore the question what a continuant is, in the context of a very interesting suggestion recently made by Rowland Stout, as part of his attempt to develop a coherent ontology of processes. Stout claims that a continuant is best thought of as something that primarily has its properties at times, rather than atemporally, and that on this construal, processes should count as continuants. While accepting that Stout is onto something here, I reject his suggestion that we should accept that processes are both occurrents and continuants; nothing, I argue, can truly occur or happen (unless it is instantaneous), which does not have temporal parts. I make an alternative suggestion as to how one might deal with the peculiar status of processes without jettisoning a very natural account of occurrence; and assess the consequences for the category of continuant.
BibTeX:
@article{steward_iwhat_2015,
  author = {Steward, Helen},
  title = {I—What is a Continuant?},
  journal = {Aristotelian Society Supplementary Volume},
  year = {2015},
  volume = {89},
  number = {1},
  pages = {109--123}
}
Dretske, F.I. Knowledge and the flow of information 1981   book URL 
BibTeX:
@book{dretske_knowledge_1981,
  author = {Dretske, Fred I.},
  title = {Knowledge and the flow of information},
  publisher = {Blackwell},
  year = {1981},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3NS0MxDA9-gAw8OHX1a9CLx_dou0f6ep6OwTx48OBttL72JBviQPzvTfva4WQeQ6GkoeSX_NokABNVi-qPT3DeNZ7Q1YrY4M5pCnw7KW0rUFhs0mP6L2Z7X964U4BeGAyp0KSM_ZCSvO0YR0JdiiI0YWvusFNks9NsLyHI7AyOYlXBEA786hwGz2WAwPcF3C8KrcUpq-cUkPHwvv7i68BzV9Nou0sYzx5fpvMq7rzMpMvSZdWkGsGpjZ_VV5tU1NYxOA50szyL3p6RFgxOXs3TQztfTHtxWMT6M1Ve1R8bRuCSLmaFtb4CTifCgNi8Oewaq4QJbRcM2om2wittrmG0X52b_xZuYSBNK3t24S7rOO7t9QNr63pQ}
}
Dretske, F. Knowledge and the Flow of Information 1999   book URL 
Abstract: LCCN: 99012546
BibTeX:
@book{dretske_knowledge_1999,
  author = {Dretske, F.I.},
  title = {Knowledge and the Flow of Information},
  publisher = {Cambridge University Press},
  year = {1999},
  url = {https://books.google.com.au/books?id=GC9xQgAACAAJ}
}
Dretske, F.I. and Bernecker, S. Knowledge: readings in contemporary epistemology 2000   book URL 
BibTeX:
@book{dretske_knowledge:_2000,
  author = {Dretske, Fred I. and Bernecker, Sven},
  title = {Knowledge: readings in contemporary epistemology},
  publisher = {Oxford University Press},
  year = {2000},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwY2AwNtIz0EUrEyxSUw0sU0C3lJsmW5qkGZsZmqVapCQmmhslGVmmJlqgjmxj6zeibECHjWAYgQ6sAh24zQzs5CHlSlBiMgcd5W5hCjlGB9gqB3YVYCfuwPimKIfvgWsUN0EGFtAuAyEGptQ8YQauANiFApUiDJzesGEuUQZZN9cQZw9dkO546EBLfBLUOWZGYgwswM57qgSDgqlJqmFiWpJJqhGQAPY2LVIMzc3ME01Al30ZJJmmSjKIYTdDCpeENAMXZDs4aBhAhoE1DZhMU2UhHgEA_WJhaQ}
}
Knuth, D. Knuth: Computers and Typesetting   online URL 
BibTeX:
@online{knuthwebsite,
  author = {Donald Knuth},
  title = {Knuth: Computers and Typesetting},
  url = {http://www-cs-faculty.stanford.edu/~uno/abcde.html}
}
Grünwald, P. and Vitányi, P.M. Kolmogorov Complexity and Information Theory With an Interpretation in Terms of Questions and Answers 2003 Journal of Logic, Language and Information
Vol. 12(4), pp. 497-529 
article URL 
Abstract: We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
BibTeX:
@article{Grunwald2003,
  author = {Grünwald, P. and Vitányi, P. M. B.},
  title = {Kolmogorov Complexity and Information Theory With an Interpretation in Terms of Questions and Answers},
  journal = {Journal of Logic, Language and Information},
  year = {2003},
  volume = {12},
  number = {4},
  pages = {497--529},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3NT9swFH9CnCqhwWAdGWPyCXbJsGOncbmgMq2atB7QtGlHy4ntDakkHQ2V-O_xcz6q0nHYLZLtKMmz875-7_cAePKJxs_-Cdz6ncGZ1SLPc-pVrpOC0yKzfkBybTYj230kA0GWASUYcvreXMrn9kIgcp6n2dXib4zdozDL2rbSCEXs44Dsm1yvkR5Z6FVKx0kae-3EtlLMHRYx0Glin6OttGjQNtN96OBtHcqkTz2vi-P_gcL-z7c4gFetNUomzfZ5DTu2PIS3szaGuSRnZNbTLi8PYXDTNT54PAL3rZrfVb-r-2pF8K-CzJr1I9GlIW2NE8qcNMX_l-TXbf3HD5JNlCO59TO8cliSypEQfMVjEG4yKZfYwO0N_Jx–fH5a9x2bYgL5n2PWAuqmXWaCitGznA2SgsqC55YNE1SQ533YrzUHc_QlqOWyjwbjfNM5K5wTPMh7GlE95d1qAI0x0C4cVJax0RWeAsKOQeFlsYkeXC4mI7gw6ZElca0jfqOeVPvhCcRDIME1KLh8FDd54_gvBF5P_Kw0v4kKKTd9peqnCuk3BulEVx0e6Kfy6gUY-zkKVQrKZUyLoVI1cK4CD5urUAsYLuMJUp009-99HwnMAjwwRD0eQ-79f2DPW0oI58AXtwKgg}
}
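The abstract's contrast between the two theories lends itself to a small illustration (my own sketch, not from the paper): Shannon entropy is a property of a symbol distribution, while Kolmogorov complexity is uncomputable and is commonly bounded from above by the length of a compressed encoding, here via Python's zlib.

```python
import math
import zlib

def shannon_entropy_bits(s: str) -> float:
    """Shannon entropy (bits per symbol) of the empirical distribution of s."""
    n = len(s)
    return -sum((s.count(c) / n) * math.log2(s.count(c) / n) for c in set(s))

def compressed_bits(s: str) -> int:
    """Crude upper bound on Kolmogorov complexity: compressed size in bits."""
    return 8 * len(zlib.compress(s.encode()))

# A highly regular string: maximal per-symbol entropy for two symbols (1.0 bit),
# yet low algorithmic complexity, so it compresses far below its raw size.
regular = "ab" * 500
print(shannon_entropy_bits(regular))                # 1.0
print(compressed_bits(regular) < 8 * len(regular))  # True
```

The gap between the two numbers is the point of the comparison: per-symbol entropy ignores the global regularity that a compressor (or a short program) exploits.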
Devlin, K. Logic and Information 1995   book URL 
BibTeX:
@book{Devlin1995,
  author = {Devlin, K.},
  title = {Logic and Information},
  publisher = {Cambridge University Press},
  year = {1995},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1Ra8IwED6mg7ExcDrNnBP6B1qaWJPm2SmC7m0Pe5M0Jo_C1u7_766NhYqDvByBcBxJLt-R7zuAhUjS-OJOkGahuNd2qQWiIul1ujRSWM7dEZGQUt3K9jXc2CGgnysYpCQiM9WDHoK8to0j8VEzap-VBYWd1u6I7dUZZPMEfWIVDOHGnUYwCG-_KJyscgQPH61-avkMY2qBbCNE-VFQNqX4jWG-WX-utjGtfgiFl0MR3MvEBB4NfVg_VTWx7cigX_38OkYXPkNHGNx96f17vt2tGnN4NpOyJl8l3xXD_FLvzVgm6gUipb02HtGbFQZBUlE4RE8899zlqVHcTmFy3ZvX_yZmcN98hKLxBrceD4CbNyH7A_lceo8}
}
Stapel, K. and Schneider, K. Managing knowledge on communication and information flow in global software projects 2012 Expert Systems
Vol. 31(3), pp. 234-252 
article  
Abstract: Communication is a key success factor of distributed software projects. Poor communication has been identified as a main obstacle to successful collaboration. Global projects are especially endangered by information gaps between collaborating sites. Different communication styles, technical equipment, and missing awareness of each other can cause severe problems. Knowledge about actual and desired channels, paths, and modes of communication is required for improving communication in a globally distributed project. However, many project participants know little about communication and information flow in their projects. In this contribution, we focus on knowledge about communication and information flow. It is acquired by modelling on-going and desired flows of information, including documented and non-documented channels of information flow. We analyzed a distributed software project from the information flow perspective. Based on the findings, we developed specific techniques to improve information flow in distributed software development according to the FLOW Method. In a second distributed project, we evaluated one of the techniques. We found the FLOW mapping technique to be suitable for effectively spreading knowledge about communication and information flow in global software projects.
BibTeX:
@article{stapel_managing_2012,
  author = {Stapel, Kai and Schneider, Kurt},
  title = {Managing knowledge on communication and information flow in global software projects},
  journal = {Expert Systems},
  year = {2012},
  volume = {31},
  number = {3},
  pages = {234--252}
}
Arendt, W. and Schleich, W. Mathematical analysis of evolution, information, and complexity 2009   book  
BibTeX:
@book{arendt_mathematical_2009,
  author = {Arendt, Wolfgang and Schleich, Wolfgang},
  title = {Mathematical analysis of evolution, information, and complexity},
  publisher = {Wiley-VCH},
  year = {2009}
}
Rovelli, C. Meaning = Information + Evolution 2016   article  
Abstract: Notions like meaning, signal, intentionality, are difficult to relate to a physical world. I study a purely physical definition of "meaningful information", from which these notions can be derived. It is inspired by a model recently illustrated by Kolchinsky and Wolpert, and improves on Dretske's classic work on the relation between knowledge and information. I discuss what makes a physical process into a "signal".
BibTeX:
@article{rovelli_meaning_2016,
  author = {Rovelli, Carlo},
  title = {Meaning = Information + Evolution},
  year = {2016}
}
Godfrey-Smith, P. Meaning, Models and Selection: A Review of Philosophical Naturalism 1996 Philosophy and Phenomenological Research
Vol. 56(3), pp. 673-678 
article  
BibTeX:
@article{godfrey-smith_meaning_1996,
  author = {Godfrey-Smith, Peter},
  title = {Meaning, Models and Selection: A Review of Philosophical Naturalism},
  journal = {Philosophy and Phenomenological Research},
  year = {1996},
  volume = {56},
  number = {3},
  pages = {673--678}
}
Wolski, J. Measure of Amount of Information and Its Meaning 2010 FILOZOFIA NAUKI
Vol. 18(3), pp. 105-105 
article  
Abstract: There are five different conceptions of information, created over the last sixty years, each of which defines the notion of information differently. The aim of this article is to show that these five conceptions represent two separate trends. Each conception belongs either to the quantitative trend, which rests on measuring the amount of information, or to the semantic trend, which describes the meaning of information. These trends are mutually exclusive.
BibTeX:
@article{wolski_measure_2010,
  author = {Wolski, J.},
  title = {Measure of Amount of Information and Its Meaning},
  journal = {FILOZOFIA NAUKI},
  year = {2010},
  volume = {18},
  number = {3},
  pages = {105--105}
}
Wu Zhaoqi, Zhu Chuanxi and Zhang Xiaozhi Measurement interpretation and information measures in general probabilistic theory 2013 Open Physics
Vol. 11(3), pp. 317 
article DOI URL 
BibTeX:
@article{wu_zhaoqi_measurement_2013,
  author = {Wu Zhaoqi and Zhu Chuanxi and Zhang Xiaozhi},
  title = {Measurement interpretation and information measures in general probabilistic theory},
  journal = {Open Physics},
  year = {2013},
  volume = {11},
  number = {3},
  pages = {317},
  url = {//www.degruyter.com/view/j/phys.2013.11.issue-3/s11534-012-0169-x/s11534-012-0169-x.xml},
  doi = {http://doi.org/10.2478/s11534-012-0169-x}
}
Tononi, G. and Sporns, O. Measuring information integration 2003 BMC Neuroscience
Vol. 4 
article DOI URL 
BibTeX:
@article{tononi_measuring_2003,
  author = {Tononi, G. and Sporns, O.},
  title = {Measuring information integration},
  journal = {BMC Neuroscience},
  year = {2003},
  volume = {4},
  url = {http://dx.doi.org/10.1186/1471-2202-4-31},
  doi = {http://doi.org/10.1186/1471-2202-4-31}
}
Schneider, T.D. Measuring Molecular Information 1999 Journal of Theoretical Biology
Vol. 201(1), pp. 87-92 
article URL 
BibTeX:
@article{schneider_measuring_1999,
  author = {Schneider, Thomas D.},
  title = {Measuring Molecular Information},
  journal = {Journal of Theoretical Biology},
  year = {1999},
  volume = {201},
  number = {1},
  pages = {87--92},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV07T8MwELZQJSQW3o_yENmYQh07cWwx8apYusFs2clZKkOpUBn49_hsh7SILt2i6Kw4d9E9nPu-I4SzW5r_8QmNrSUUrf86jGsktJK2wq-V1khX14FZaelkO0JjUpNligTRwwffne6Mkm5H8-kUEb8MQZgcEx7MExBZLSQ2-Y2fHnoa3jIMDQw97Cjd0ThSMXpf2Cli9xQWtGxdmOohTytZaIhG4z3StUB1XSi_v6Z78Pw_Xdqbv-U-2U2Ja3Yf5Q7IFswOyXYcZfl9RK4n4bDR7zubdBN3s4R1Qtsfk7fx8-vjS56GL-QNY0rm1ram5EBVY2oFjX8qFcgVQ433SJIqWQhGrRTUGFu3XIkKsBhxgkOpXFXxEzKYfczgjGSVodyXNUUJPpdwQCWAkxx8rum8q63aIbnpdK3nkWNDRzZlodEqGq2CvWhsSIrOFHpFY9oHgLVr7pLNtEmQg9BirM3cS0a5UCr5VChe-MitvdZpXH0a7by0s4qXJZfnG-zlguxE2gc8wrkkg8XnF1xFAsgffrzvUA}
}
Stapp, H. Minds and values in the quantum universe 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 104-120  incollection  
BibTeX:
@incollection{stapp_minds_2010,
  author = {Stapp, Henry},
  title = {Minds and values in the quantum universe},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {104--120},
  note = {DOI: 10.1017/CBO9780511778759.006}
}
Sagawa, T. and Ueda, M. Minimal energy cost for thermodynamic information processing: measurement and information erasure 2009 Physical review letters
Vol. 102(25), pp. 250602 
article URL 
Abstract: The fundamental lower bounds on the thermodynamic energy cost of measurement and information erasure are determined. The lower bound on the erasure validates Landauer's principle for a symmetric memory; for other cases, the bound indicates the breakdown of the principle. Our results constitute the second law of "information thermodynamics," in which information content and thermodynamic variables are treated on an equal footing.
BibTeX:
@article{sagawa_minimal_2009,
  author = {Sagawa, Takahiro and Ueda, Masahito},
  title = {Minimal energy cost for thermodynamic information processing: measurement and information erasure},
  journal = {Physical review letters},
  year = {2009},
  volume = {102},
  number = {25},
  pages = {250602},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1La8MwDBajMNhl70f3AP-BtI6d5tHbKC279FK2c5AfgR6ajLbb758cp82yB2zXxBhiKdJnSZ8EIMWAB19sgjCRiIxBctdaRqniI8OVtiaVTsEL2Y1sA_85oR9yOXSFkQv77sguru3AQLgOec4GR3Esav7IYm-JQxF7SyxdGQJPGobw79t0nFMHZtbuZnYCu5rjXZnJPvfcsuO_l2H_-TNO4bjBoezRK84ZHNjyHA7relC9uQCcL8vlihZMa24gm1SbLSN8y0it1qvK-Dn2rCEzOeGyhnJArnDM5m3kkWFpOuum6_rdJbzMps-Tp6CZxhCgS9UE6JOmmSUMJARmBf3s2iLdzDVaWRToppbHiKlKeKi1RKVsQfc5QlCjIhRaXkGvrEp7A4xgUDZK3WSsJI1QZ8qgSk0sMp2g5jbpw3AnhvzVN93I68sKl_mng6NnIvcH14drL612veuUSPjq9t973cGRTxm5UMs99LbrN_vguzR-ABor094}
}
Dretske, F.I. Minimal rationality 2012   incollection  
BibTeX:
@incollection{dretske_minimal_2012,
  author = {Dretske, Fred I.},
  title = {Minimal rationality},
  year = {2012}
}
Long, B.R. Minimal Semantic Information of Real Structure 2013 In Preparation
Vol. N/A, pp. N/A 
article  
BibTeX:
@article{Long2013a,
  author = {Long, B. R.},
  title = {Minimal Semantic Information of Real Structure},
  journal = {In Preparation},
  year = {2013},
  volume = {N/A},
  pages = {N/A}
}
Wallace, C.S. and Dowe, D.L. Minimum Message Length and Kolmogorov Complexity 1999 The Computer Journal
Vol. 42(4), pp. 270-283 
article URL 
Abstract: The notion of algorithmic complexity was developed by Kolmogorov (1965) and Chaitin (1966) independently of one another and of Solomonoff's notion (1964) of algorithmic probability. Given a Turing machine T, the (prefix) algorithmic complexity of a string S is the length of the shortest input to T which would cause T to output S and stop. The Solomonoff probability of S given T is the probability that a random binary string of 0s and 1s will result in T producing an output having S as a prefix. We attempt to establish a parallel between a restricted (two-part) version of the Kolmogorov model and the minimum message length approach to statistical inference and machine learning of Wallace and Boulton (1968), in which an 'explanation' of a data string is modelled as a two-part message, the first part stating a general hypothesis about the data and the second encoding details of the data not implied by the hypothesis. Solomonoff's model is tailored to prediction rather than inference in that it considers not just the most likely explanation, but it also gives weights to all explanations depending upon their posterior probability. However, as the amount of data increases, we typically expect the most likely explanation to have a dominant weighting in the prediction.
BibTeX:
@article{wallace_minimum_1999,
  author = {Wallace, C. S. and Dowe, D. L.},
  title = {Minimum Message Length and Kolmogorov Complexity},
  journal = {The Computer Journal},
  year = {1999},
  volume = {42},
  number = {4},
  pages = {270--283},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1NS8QwEA3qyYvf4if0JMLSkqZpmx48iCiCiuCu4C0kabKs7Lay7urfd7JJ210P_gCvLQ1lZnh5M8m8QSghEQ5_YYKRLBXUyBj2AyHyQjNZCJEVDLIPQwT9VdluJkN2z_6D459G1Wgyn_Se7HCToW3YqYa-f-2hHk_qYT2tvxY4YLUwZyvHum6KnZvz0JxUt6cUUa8f9XzhveW_Ue8xAhb-rbvygZMcSEOCu2TTtSUu1bx8YxXwWz9GpClJeAQF3KaZX0E70KQZDq0M_DKqUrIUPXQZInPc7T3tjcAkp2lstdgurOT5pByp2ZWuwtf-OuTTzGbVL89v7RZL8GLwWvszq6RidU9dEIXBDtryVguunWd20Zqu9tB2Y9XAg-k-wt5RgXdU4BwVgC2CzlFB56gD9Hp3O7i5D_38ilBBFozDXEGYSwWUtATgkwLYVJkxrcoySQ2WymBLIeKS5VoBEY2lAf4mVaxEolKWyeQQbVR1pY9QQKjJUiMYNk7SrlBKagZ0rUioljI9RpeNAfiHkynh7npBwiGA36sxp4RTDsY_RmeNgbiQtp6mZp-cAM3OU5LDSoft63I85q1jTv7-7hRtduF1hjZm07k-d9qWPwMQRdE}
}
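The two-part message idea in the abstract above can be sketched in a few lines of Python. This is a toy illustration only, not Wallace and Dowe's actual MML construction: the uniform grid of candidate hypotheses and the i.i.d. binary data model are assumptions of this sketch. The first part of the message names a hypothesis; the second encodes the data given that hypothesis; the explanation chosen is the one minimizing the total length.

```python
import math

def mml_two_part(data, grid=101):
    """Toy two-part MML for i.i.d. binary data.
    Part 1: bits to state the hypothesis (uniform code over a grid of p values).
    Part 2: bits to encode the data given the hypothesis (-log2 likelihood).
    Returns (total message length in bits, chosen p)."""
    n, k = len(data), sum(data)
    part1 = math.log2(grid)  # cost of naming one of `grid` hypotheses
    best = None
    for i in range(grid):
        p = i / (grid - 1)
        if (k and p == 0) or (k < n and p == 1):
            continue  # a zero-likelihood hypothesis cannot encode the data
        part2 = -(k * math.log2(p) if k else 0) \
                - ((n - k) * math.log2(1 - p) if n - k else 0)
        total = part1 + part2
        if best is None or total < best[0]:
            best = (total, p)
    return best
```

For the string 10110111 (k = 6 ones out of n = 8), the minimizing hypothesis on this grid is p = 0.75, the sample frequency, as expected.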
Godfrey-Smith, P. Misinformation 1989 Canadian Journal of Philosophy
Vol. 19(4), pp. 533-550 
article URL 
BibTeX:
@article{godfrey-smith_misinformation_1989,
  author = {Godfrey-Smith, Peter},
  title = {Misinformation},
  journal = {Canadian Journal of Philosophy},
  year = {1989},
  volume = {19},
  number = {4},
  pages = {533--550},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5kTyv43tX6gP0D1SZNkxYEEXHxonjQc8hTlKXW3fXgvzfT1-Lr4CWXFDqThHlkvnwDkNLTJP5mE4IbLphV3AaHZXJFMmW15Rkx3HjMh77ebPc3GQiyrFGCdU0_hEt65s4YUpZxnl9UbzF2j8Iqa9tKI5hiQglS6NO7ojfI4Yg1pOEsi9E_9ta3QSCuw7Dq2gZ8_DDJtZ-ZbkEHbOvwJX3RefUs_hf89T_l34bNNg6dXDYHZwfWXLkLw_tewj3YuH1etLSquHkjeJxeP1zdxG33hPiJcMwwBbfeakZ1iHmIt1ynjisiXMh4mLVEqYw7ViAfDHFGYKDimaFC24Iq5206hkH5WroDmGBx1CemUKkpmKBKO54wnTHBmQ8JShrBCBdUolDLuTIyF3mIpPIIxrX2smqYM2SnegTnqyXvZ1dqmReclojnQtAdDgmT-Gw4DFkSwX63SdLOwnc05MKMBnt1-Ncfj2BYo8NqIMoxDJbzd3fSUC9-Al86yFM}
}
Scharnhorst, A., Börner, K. and Besselaar, P.v.d. Models of science dynamics: encounters between complexity theory and information sciences 2012   book  
BibTeX:
@book{scharnhorst_models_2012,
  author = {Scharnhorst, Andrea and Börner, Katy and Besselaar, Peter v. d.},
  title = {Models of science dynamics: encounters between complexity theory and information sciences},
  publisher = {Springer},
  year = {2012}
}
Aberg, J., Shtarkov, Y.M. and Smeets, B.J.M. Multialphabet coding with separate alphabet description 1997 , pp. 56-65  inproceedings URL 
Abstract: For lossless universal source coding of memoryless sequences with an a priori unknown alphabet size (multialphabet coding), the alphabet of the sequence must be described as well as the sequence itself. Usually an efficient description of the alphabet can be made only by taking into account some additional information. We show that these descriptions can be separated in such a way that the encoding of the actual sequence can be performed independently of the alphabet description, and present sequential coding methods for such sequences. Such methods have applications in coding methods where the alphabet description is made available sequentially, such as PPM.
BibTeX:
@inproceedings{aberg_multialphabet_1997,
  author = {Aberg, J. and Shtarkov, Yu M. and Smeets, B. J. M.},
  title = {Multialphabet coding with separate alphabet description},
  year = {1997},
  pages = {56--65},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnZ3PT4MwFMdfzE7zopszoi7pwetYhfLrbFj0YjTqmbTwOC1sGfP_t69dQTSZiTeahtKG0n55fe_zAMLA54sfa4LS00ihikWqAqlVvv6oVKaSsAyJh47p0LINLusd5Tsxrmjo06U52dfldilVW8j12mAF5e6QPEMr8cyAPyknuJ7a8VNvbuFRlHIhDAGSCPH0E3bg77hy5GJqeLZ8y18_8mcK5Ut82-wg-4rZfFZn4M4hnNNJdxLdx8r_dsr-z6DOYdZHALKXbm-bwAk2Uzj9Bi-cwph0qsU8X0BiAnlN4K7CPSs3dBsjIy9r0eDFkXW1FXZr1QzuVvn7w-OC-lhsLfqi4EZU6teopUEcSBlewqjZNHgFLJNKBgIj6rAok1oLzyqW4j6QSc3jVHngDZvidmjFtqo9mB99zPUf9TcwtlhZMo3cwmi_-8S5BSt-AX9IvZw}
}
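The separation described in the abstract above can be illustrated with a minimal sketch, assuming a fixed-length index code rather than the sequential PPM-style methods the paper actually presents: the alphabet is described once as its own component, and the sequence is then coded as indices into it, independently of how the alphabet description was transmitted.

```python
import math

def encode_separated(seq):
    """Toy separation of alphabet description from sequence coding.
    Part 1: the alphabet itself (described independently).
    Part 2: the sequence as fixed-length binary indices into the alphabet."""
    alphabet = sorted(set(seq))
    index = {s: i for i, s in enumerate(alphabet)}
    bits = max(1, math.ceil(math.log2(len(alphabet))))
    payload = "".join(format(index[s], f"0{bits}b") for s in seq)
    return alphabet, payload

def decode_separated(alphabet, payload):
    """Recover the sequence from the two separately transmitted parts."""
    bits = max(1, math.ceil(math.log2(len(alphabet))))
    return [alphabet[int(payload[i:i + bits], 2)]
            for i in range(0, len(payload), bits)]
```

The point of the separation is visible in the decoder: it needs only the alphabet and the payload, not any detail of how the alphabet description itself was coded.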
Devin, M. Musings on Firewalls and the Information Paradox 2014 Galaxies
Vol. 2(2), pp. 189-198 
article URL 
Abstract: The past year has seen an explosion of new and old ideas about black hole physics. Prior to the firewall paper, the dominant picture was the thermofield model apparently implied by anti-de Sitter conformal field theory duality. While some seek a narrow response to Almheiri, Marolf, Polchinski, and Sully (AMPS), there are a number of competing models. One problem in the field is the ambiguity of the competing proposals. Some are equivalent, while others are incompatible. This paper will attempt to define and classify a few models representative of the current discussions.
BibTeX:
@article{devin_musings_2014,
  author = {Devin, Michael},
  title = {Musings on Firewalls and the Information Paradox},
  journal = {Galaxies},
  year = {2014},
  volume = {2},
  number = {2},
  pages = {189--198},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V3JTsMwELWAExJCrGKVfOotEC9JnAMHtoLEEThXXqtKXVBbRPl7xolJ3YgT4sQtipTK9ozHb6bPbxBi9CJNWjHBWQmGN4bpQmRKOquVyzicdhIQcZG6VmX7m2W0fPcfDO87N_tWnGDWLsSzDzkcziKeZHNbEcDjVJrJIoanD3IoFxGtMJDqPbUoKHSHAgGJeSUVScgTE1eu-N4N-r4hSVzoapgJy_BDAUwknIdSo_3hXYifNHITGsVCUvcGCscqqZtNtyM2Y6WnOPbD9Cig1-a7WBs7UkNtaWN7FmFvMu33wor3SqWN5JkTPGM810pprY0lpTO5zTNNOl5UfWQGen5lx8nr8zpk7MLn7XdP3aY85-VSBfP8hGbOtTCUH_Bla7gr_R4qYPKyg7ZDRoGv63HtojU73kNH1zP_H8dk9Ik7uHquS1izfZQGB8GTMW4cBIODYHAQHDkIDg5ygF679y-3j0nom5EYUuQkAVBviUgNF6nmumAF4DDYfrQwVEgC60GVgBmrVEOynQPek046SJwzISRzkrNDtCX9_YrxvLqHaY4Qhj0Fi0htrhnj2ioljC2MdbmiZQaH5jG6qQzxVmuk9H5lmZO_-JFTtLncBWdoYz59t-e1yOYXrWh4NA}
}
Kraemer, D.M. Natural probabilistic information 2015 Synthese
Vol. 192(9), pp. 2901-2919 
article  
Abstract: Natural information refers to the information carried by natural signs such as that smoke is thought to carry natural information about fire. A number of influential philosophers have argued that natural information can also be utilized in a theory of mental content. The most widely discussed account of natural information (due to Dretske, in Knowledge and the flow of information, 1981/1999) holds that it results from an extremely strong relation between sign and signified (i.e. a conditional probability of 1). Critics have responded that it is doubtful that there are many strong relations of this sort in the natural world due to variability between signs and signified. In light of this observation, a promising suggestion is that much of the interesting natural information carried by natural signs is really information with a probabilistic content. However, Dretske's theory cannot account for this information because it would require implausible second order objective probabilities. Given the most plausible understanding of the probabilities involved here, I argue that it is only sequences of traditional natural signs (not individual signs) that carry this probabilistic information. Several implications of this idea will be explored. (Issue title: Special Section on The Roles of Experience in A Priori Knowledge, edited by Magdalena Balcerak Jackson.)
BibTeX:
@article{kraemer_natural_2015,
  author = {Kraemer, Daniel M.},
  title = {Natural probabilistic information},
  journal = {Synthese},
  year = {2015},
  volume = {192},
  number = {9},
  pages = {2901--2919}
}
Stoltenberg, H. and Albrecht, A. No firewalls or information problem for black holes entangled with large systems 2015 Physical Review D - Particles, Fields, Gravitation and Cosmology
Vol. 91(2) 
article  
Abstract: We discuss how under certain conditions the black hole information puzzle and the (related) arguments that firewalls are a typical feature of black holes can break down. We first review the arguments of Almheiri, Marolf, Polchinski and Sully favoring firewalls, focusing on entanglements in a simple toy model for a black hole and the Hawking radiation. By introducing a large and inaccessible system entangled with the black hole (representing perhaps a de Sitter stretched horizon or inaccessible part of a landscape), we show complementarity can be restored and firewalls can be avoided throughout the black hole's evolution. Under these conditions black holes do not have an "information problem." We point out flaws in some of our earlier arguments that such entanglement might be generically present in some cosmological scenarios and call out certain ways our picture may still be realized.
BibTeX:
@article{stoltenberg_no_2015,
  author = {Stoltenberg, Henry and Albrecht, Andreas},
  title = {No firewalls or information problem for black holes entangled with large systems},
  journal = {Physical Review D - Particles, Fields, Gravitation and Cosmology},
  year = {2015},
  volume = {91},
  number = {2}
}
Réfrégier, P. Noise theory and application to physics: from fluctuations to information 2004   book  
BibTeX:
@book{refregier_noise_2004,
  author = {Réfrégier, Philippe},
  title = {Noise theory and application to physics: from fluctuations to information},
  publisher = {Springer},
  year = {2004}
}
Walleczek, J. and Grössing, G. Nonlocal Quantum Information Transfer Without Superluminal Signalling and Communication 2016 Foundations of Physics
Vol. 46(9), pp. 1208-1228 
article  
Abstract: It is a frequent assumption that—via superluminal information transfers—superluminal signals capable of enabling communication are necessarily exchanged in any quantum theory that posits hidden superluminal influences. However, does the presence of hidden superluminal influences automatically imply superluminal signalling and communication? The non-signalling theorem mediates the apparent conflict between quantum mechanics and the theory of special relativity. However, as a ‘no-go’ theorem there exist two opposing interpretations of the non-signalling constraint: foundational and operational. Concerning Bell’s theorem, we argue that Bell employed both interpretations, and that he finally adopted the operational position which is associated often with ontological quantum theory, e.g., de Broglie–Bohm theory. This position we refer to as “effective non-signalling”. By contrast, associated with orthodox quantum mechanics is the foundational position referred to here as “axiomatic non-signalling”. In search of a decisive communication-theoretic criterion for differentiating between “axiomatic” and “effective” non-signalling, we employ the operational framework offered by Shannon’s mathematical theory of communication, whereby we distinguish between Shannon signals and non-Shannon signals. We find that an effective non-signalling theorem represents two sub-theorems: (1) Non-transfer-control (NTC) theorem, and (2) Non-signification-control (NSC) theorem. Employing NTC and NSC theorems, we report that effective, instead of axiomatic, non-signalling is entirely sufficient for prohibiting nonlocal communication. Effective non-signalling prevents the instantaneous, i.e., superluminal, transfer of message-encoded information through the controlled use—by a sender-receiver pair—of informationally-correlated detection events, e.g., in EPR-type experiments. An effective non-signalling theorem allows for nonlocal quantum information transfer yet—at the same time—effectively denies superluminal signalling and communication.
BibTeX:
@article{walleczek_nonlocal_2016,
  author = {Walleczek, Jan and Grössing, Gerhard},
  title = {Nonlocal Quantum Information Transfer Without Superluminal Signalling and Communication},
  journal = {Foundations of Physics},
  year = {2016},
  volume = {46},
  number = {9},
  pages = {1208--1228}
}
Timpson, C.G. Nonlocality and Information Flow: The Approach of Deutsch and Hayden 2005 Foundations of Physics
Vol. 35(2), pp. 313-343 
article  
BibTeX:
@article{timpson_nonlocality_2005,
  author = {Timpson, C. G.},
  title = {Nonlocality and Information Flow: The Approach of Deutsch and Hayden},
  journal = {Foundations of Physics},
  year = {2005},
  volume = {35},
  number = {2},
  pages = {313--343}
}
Parikh, M. and Schaar, J.P.v.d. Not one bit of de Sitter information 2008 Journal of High Energy Physics
Vol. 2008(09), pp. 041 
article URL 
Abstract: We formulate the information paradox in de Sitter space in terms of the no-cloning principle of quantum mechanics. We show that energy conservation puts an upper bound on the maximum entropy available to any de Sitter observer. Combined with a general result on the average information in a quantum subsystem, this guarantees that an observer in de Sitter space cannot obtain even a single bit of information from the de Sitter horizon, thereby preventing any observable violations of the quantum no-cloning principle. The result supports the notion of observer complementarity.
BibTeX:
@article{parikh_not_2008,
  author = {Parikh, Maulik and Schaar, Jan Pieter van der},
  title = {Not one bit of de Sitter information},
  journal = {Journal of High Energy Physics},
  year = {2008},
  volume = {2008},
  number = {09},
  pages = {041},
  url = {http://stacks.iop.org/1126-6708/2008/i=09/a=041}
}
Mari, L. Notes towards a qualitative analysis of information in measurement results 1999 Measurement
Vol. 25(3), pp. 183-192 
article  
BibTeX:
@article{mari_notes_1999,
  author = {Mari, Luca},
  title = {Notes towards a qualitative analysis of information in measurement results},
  journal = {Measurement},
  year = {1999},
  volume = {25},
  number = {3},
  pages = {183--192}
}
Mercado-Reyes, A., Padilla-Longoria, P. and Arroyo-Santos, A. Objects and processes: Two notions for understanding biological information 2015 Journal of theoretical biology
Vol. 380, pp. 115-122 
article URL 
BibTeX:
@article{mercado-reyes_objects_2015,
  author = {Mercado-Reyes, Agustín and Padilla-Longoria, Pablo and Arroyo-Santos, Alfonso},
  title = {Objects and processes: Two notions for understanding biological information},
  journal = {Journal of theoretical biology},
  year = {2015},
  volume = {380},
  pages = {115--122},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5EUQRR10d8Qu7a0iatbb3prmVhBS978BaSbXvYQ123XcR_byZp3foCF3oZCGXaNJlp5vu-AeDM9Zxve0IYKMkznxdhNAl1TuEVMdaTZMwKvT8Wk68n23D9R0HfALOm-r8eMVmhkdw0JPIg4AjnSgf3S8HdwLQHNGh1zFIawszvt-gEpU50SfegBde3qJLPUvOSDP8Tdf0vr_dht0k56Z39RnqwlpcHsGmbUL4fwuhJ4VlMRWWZ0ZnlDeTVLR2_vdDS9PipqM5s6aJLg6FWvAlnmDbaqzjyCFj6MO4PnfY5xcyqWYgW5jUV6KJAF4WnL-bzY9iRiLMva8PHywhsFHpR5AQDFdEvicDWc_I4iIejvjV7relWhjTmvtZEz4hZU86NG50A9UKPKZn4eRLr7CFgCjXLYhlhq_OMK3UKVyv4eLbS6HPYRstixC5gvZ4v8ksrwPgB51TCiA}
}
Demopoulos, W. On extending "empiricism, semantics, and ontology" to the realism/instrumentalism controversy 2011 Journal of Philosophy
Vol. 108(12), pp. 647-669 
article URL 
Abstract: The concept of a linguistic framework and the distinction between internal and external questions are the central ideas of Rudolf Carnap's "Empiricism, Semantics, and Ontology." It is not uncommon to encounter the suggestion that reflection on the theoretical and experimental investigations which led to the acceptance of the atomic hypothesis undermines Carnap's distinction between these two types of question and the utility of his notion of a linguistic framework. Demopoulos believes this is a mistake. There is a natural development of the distinction and the notion of framework choice with which it is paired that is perfectly capable of accommodating this case. He shows this by bringing out a subtlety that arises in the extension of the conceptual apparatus of ESO to the realism/instrumentalism controversy.
BibTeX:
@article{demopoulos_extending_2011,
  author = {Demopoulos, William},
  title = {On extending "empiricism, semantics, and ontology" to the realism/instrumentalism controversy},
  journal = {Journal of Philosophy},
  year = {2011},
  volume = {108},
  number = {12},
  pages = {647--669},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1Lb9NAEF4Bp0oIUV41FGlVUS6pYb3rx_rAIRhXMWrsyHYQF2T5lQopTaLWkeDfM7t-xnDomas9tpL9vDPf7M5-gxCjH4g68gmGlaVlZjKIZ3zFc42LwGoVRm6R0tStbLSy3SoM9df-B-ADf-J-j13_i1iHOqfUnS-80HO8aC5XOt351I89R1bsCWmpwI_lXhJYTuJAVgCF7vRKml9Cah-HS6H3L69MHLAOg29uGB2y2r6CaDHzroIoWMx6ilzebHfb_XpQz9cuM4g6t4OSja-D5_tzVP15AFM2RYeIUrtRm9kqN2ox0s7PEj78oOjAbZq16mYTgc26ecuhOPYoaHWlhDKpE23UmNyoeC8U02-Kn3n1qdyoy-ghpONcJuifoz5CG23pT_3L_w7HQjg23-72d__kI5J7xE_RkyZpwNMa7GP0oNw8Q0eLtgvF7-foR-DjDnV81mN-gTvELzDgjVu8z3AcYEAbN2h_HGGNB1i_QMtLN3ZmatM5Q70WKb4KpLQgNLc51_WSZbppFmQFTBFyS5aWpCC84KTUGTMBKEtslWZpSjMxEy1mZ5y9RI9TccJiU8mTmMUJwgzYFLfztMiBQxN9lRVUTzWarWyDGfC4gl6JQUzE_Khu0zyhNkRTU9cI3GnHNSnW64SaonRZKOMp6Lwe5mRXa6skNLmjCUm4Dh6CadTSrKT6VSnoZGQHLwaiCX9HQe-GAHUGo-9CQdp9zJxGGF8IQlSv7_fqN-ionzCn6FF1uy_f1qqefwAW9YAf}
}
Van Leeuwen, J. On Floridi's method of levels of abstraction 2014 Minds and Machines
Vol. 24(1), pp. 5 
article URL 
Abstract: Abstraction is arguably one of the most important methods in modern science in analysing and understanding complex phenomena. In his book The Philosophy of Information, Floridi (The philosophy of information. Oxford University Press, Oxford, 2011) presents the method of levels of abstraction as the main method of the Philosophy of Information. His discussion of abstraction as a method seems inspired by the formal methods and frameworks of computer science, in which abstraction is operationalised extensively in programming languages and design methodologies. Is it really clear what we should understand by levels of abstraction? How should they be specified? We will argue that levels of abstraction should be augmented with annotations, in order to express semantic information for them and reconcile the method of level of abstraction (LoA's) with other approaches. We discuss the extended method when applied e.g. to the analysis of abstract machines. This will lead to an example in which the number of LoA's is unbounded. © 2013 Springer Science+Business Media Dordrecht.
BibTeX:
@article{van_leeuwen_floridis_2014,
  author = {Van Leeuwen, Jan},
  title = {On Floridi's method of levels of abstraction},
  journal = {Minds and Machines},
  year = {2014},
  volume = {24},
  number = {1},
  pages = {5},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1LS8QwEA4qHrz4fiv0pAft0iRt0h48rKJ4EcHd9Vo2L1nQrqiLf99MHt0H-geEHkoPpc0Mk8nMN9-HECWdLF2ICWXJlKwk1pSLQhAutJY8w9IYngmsxUJlO_bdp8_-g-Efm4s7gNWpkavFPziJaNcKAHyQQ250BRQ4ZGuTqOg0AnisR14AxNLD5WLKCnjE7oeDFnmCjimXZ6zMOE0ET-h008KSQvSYQf5MvsNESPDMUHTAecQpLxQdZ8uJJE9Z7gUzOzoEU26jLfPEVjHa-onpOa_yobP4NaBnYcAZGCZSUKKw-SZO-XT3ih376959OBzP02fbfKpipd2pz4BL_U2N5NeVbtJBb9ke1Es4rj_1ntu-E1yOnTH8TeyDu2HLhW9oN_HVBmSfPmfykv4mWg_WSbreEbbQkm620UYU60jC6u-gy8cmCX5x_pl4r0jGJvFeAXczXrGLBne3_Zv7NEhlpC8Y5t6EtmlaAeL19kSMqchMVRLltE8VkwWmBat0gTWw-1CbwRJtDFGSEymLIZGY7qGVZtzoA5RgrUjGRJYrLPIhliWnxFTDTBjOhSnzQ7QPv12DmeCr6naFDxHzK1G_e6aUejKpgb5cWQtLXYe6JDxtXmsQg6yp3UMYPfrzlcdobep-J2jl62OiTz1j5g8L9FuQ}
}
D’Alfonso, S. On Quantifying Semantic Information 2011 Information
Vol. 2(4), pp. 61-101 
article URL 
Abstract: The purpose of this paper is to look at some existing methods of semantic information quantification and suggest some alternatives. It begins with an outline of Bar-Hillel and Carnap’s theory of semantic information before going on to look at Floridi’s theory of strongly semantic information. The latter then serves to initiate an in-depth investigation into the idea of utilising the notion of truthlikeness to quantify semantic information. Firstly, a couple of approaches to measure truthlikeness are drawn from the literature and explored, with a focus on their applicability to semantic information quantification. Secondly, a similar but new approach to measure truthlikeness/information is presented and some supplementary points are made.
BibTeX:
@article{dalfonso_quantifying_2011,
  author = {D’Alfonso, Simon},
  title = {On Quantifying Semantic Information},
  journal = {Information},
  year = {2011},
  volume = {2},
  number = {4},
  pages = {61--101},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1LS8NAEB5EQRQR6yPWBwT0mribTbLJUaulUEFEBW9hH5ObRWv7_51J0tL25C0TEjY7s7PfzO7OFwCVxCLamBOkdMJLk-aZI3zwucrRKIlIYJN4l2ysbC_P1Kxv6CvKz-9Y77yDK5qcRxP-cyQ0Hi4XVoTWTBHf1uJtvrKGPisk_Q2aDI_gsAsDw_vWbj3Ywskx7K-QA57AzcskfJ0bPszDpUjhG36x4MKugog1egofw6f3wSjqfmkQOYpVZJTlNDei0alJ6GtKiV5hLbGQua8JN1zmycMy61FYa52yThdCaaWFtyblcOIMDgwffef2KBj1AezUNE4xYOwIqB8B7H6Wz4_FaDxoxd5CjH-bOq74ZxYQVDXDPMpjfQ6hVNopX5MtmJy-pAuktK405N2JRSX6cLvQWvXdkmBUlDywcqsV5fbhgTW6fISZq5sbZL-qc4TKG0kGKkrjKBGsU2FpLChKw7MyrYXJ5cX_2rqEvXZ1V0ayuILt2XSO1y2V4h8xhq1x}
}
Brenner, J.E. On Representation in Information Theory 2011 Information
Vol. 2(4), pp. 560-578 
article URL 
Abstract: Semiotics is widely applied in theories of information. Following the original triadic characterization of reality by Peirce, the linguistic processes involved in information—production, transmission, reception, and understanding—would all appear to be interpretable in terms of signs and their relations to their objects. Perhaps the most important of these relations is that of representation: one entity standing for or representing some other. For example, an index—one of the three major kinds of signs—is said to represent something by being directly related to its object. My position, however, is that the concept of symbolic representations having such roles in information, as intermediaries, is fraught with the same difficulties as in representational theories of mind. I have proposed an extension of logic to complex real phenomena, including mind and information (Logic in Reality; LIR), most recently at the 4th International Conference on the Foundations of Information Science (Beijing, August, 2010). LIR provides explanations for the evolution of complex processes, including information, that do not require any entities other than the processes themselves. In this paper, I discuss the limitations of the standard relation of representation. I argue that more realistic pictures of informational systems can be provided by reference to information as an energetic process, following the categorial ontology of LIR. This approach enables naïve, anti-realist conceptions of anti-representationalism to be avoided, and enables an approach to both information and meaning in the same novel logical framework.
BibTeX:
@article{brenner_representation_2011,
  author = {Brenner, Joseph E.},
  title = {On Representation in Information Theory},
  journal = {Information},
  year = {2011},
  volume = {2},
  number = {4},
  pages = {560--578},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1LS8QwEB7EkyLiE9cH9CB4qmaTbKY5-loED4LouUxe4KWIrP_fSdpdunsSeklp6HQmNN9M5_sKoOStqDfeCcamYKQkh5EBgPSotUM7TUlwgkGFGjOqbK96atY_6CvOz–y32UBvjlRRz3LfXxPr_NVYUUgZon4nou3OWVt9xmJ9JfdZH4A-wMMrO77uB3CVuyOYHckDngMN29d9V76VAd6UFd9ddVAHyrDnld_Ap_z54_Hl3r4rUHtGa9M6yynomVwjDZ88EF4kXykRqKXMsMtUmQMkjI8ZvOtSlmDD5P1IipFWp3CHuX2925RaHLhDCqcSRsbG6PmI8VEjUvWSG28sS5InMD18rnb717GomX4n93TjtwzgYfsk9UlWXu6nOAItMNSbpVTXrF1HEahER0lEtITsbmhoVk8_9-9LmCnr8_aemovYXvx8xuvejHEPzhsoVA}
}
Kolmogorov, A.N. On tables of random numbers 1998 Theoretical Computer Science
Vol. 207, pp. 387-395 
article  
BibTeX:
@article{Kolmogorov1998,
  author = {Kolmogorov, A. N.},
  title = {On tables of random numbers},
  journal = {Theoretical Computer Science},
  year = {1998},
  volume = {207},
  pages = {387--395}
}
Rackovsky, S. and Scheraga, H.A. On the information content of protein sequences 2011 Journal of biomolecular structure & dynamics
Vol. 28(4), pp. 593-594 
article URL 
BibTeX:
@article{rackovsky_information_2011,
  author = {Rackovsky, S. and Scheraga, H. A.},
  title = {On the information content of protein sequences},
  journal = {Journal of biomolecular structure & dynamics},
  year = {2011},
  volume = {28},
  number = {4},
  pages = {593--594},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V07a8MwEBZtoNCl9N30AdqNW-sRSxo6lNKQLUOdOch6QChxMpRC_30lS5adQP5AV8sefN9xujudvg8Agp-LfC8muCrBMFIXQvGSUCG1EZjh0siiLq3ldq-z3QmI9s_-A_DzMLgYGVFbeP08ejzzb3kZVk2WRqgPZKf-Wn6nnJsFkll_1OAdRQcR-35IXqqvzU9swvaqXt4bZGjbzmLHVPft0jSpYfqI6K_MD0Mm5gPXoIP4Nwlyh8mw27WzLEah0dRvOWkQMC15ovO1XqnvV9Pki89jV0VzX0tX0yrVzwgx1JI3h292MomdmqDNDapzcBbNBt8CGBfgyDSX4CTIfP5egZd5Ax0kcAAJjJDAjYUREpgguQaL6Uf1PsujUEWukEsoc0pqLZghmClaKiukYoSwgtY1cbuHZQRbJoWy1OWaQnNDicK14XaCrbZIYXIDRs2mMXcAolJaSlxNqSeG1kQKw5GhlisXl5HVZAyy7qeX28BHskQdzSsjbuvyCZ1Lmr2mOBuD22CX9G5nvPuDKw_gtHeERzByHmaeAknlH7ZnMmE}
}
Bynum, T.W. On the Possibility of Quantum Informational Structural Realism 2014 Minds and Machines
Vol. 24(1), pp. 123-139 
article URL 
Abstract: In The Philosophy of Information, Luciano Floridi presents an ontological theory of Being qua Being, which he calls "Informational Structural Realism", a theory which applies, he says, to every possible world. He identifies primordial information ("dedomena") as the foundation of any structure in any possible world. The present essay examines Floridi's defense of that theory, as well as his refutation of "Digital Ontology" (which some people might confuse with his own). Then, using Floridi's ontology as a starting point, the present essay adds quantum features to dedomena, yielding an ontological theory for our own universe, Quantum Informational Structural Realism, which provides a metaphysical interpretation of key quantum phenomena, and diminishes the "weirdness" or "spookiness" of quantum mechanics.
BibTeX:
@article{bynum_possibility_2014,
  author = {Bynum, Terrell W.},
  title = {On the Possibility of Quantum Informational Structural Realism},
  journal = {Minds and Machines},
  year = {2014},
  volume = {24},
  number = {1},
  pages = {123--139},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3dT9swED-hgdAktAIbIWxIfoAXtEStnTr2CxLqhiqBxDfaW1Qn9guifLR94L_fnROnHfDApD40zdVpfef78N39DCB42k1e6YTcaG2M5GbUF6UrSylyO8qrnsy5Kq2PGxd2toG3OxnjuzQkKL3enre-EeZAQmcToAciEmozR1NFMn55ddvmEejl0fZ4lkiMBUJe870R_rFMjX5eod6Q2WQhTfqugfLG6LgDoUkmFKG0mel57_zbIu3_-JPr8KXxU9lRLVgbsGTHm9AJZ0CwRiV8hcOzMUMfkp0_TJpC2xf24NjFDDk2u2dNt1O94ciuPFgtAX2wS0vIi_ff4Ob49_VgmDRnMiQl-oY6oeK0vuLaVtpqpSrZLTkl9xxqVcWN0cI5K6RBTaKEcuiv6SqTomdsVzsnMrEFayOq3R9PfY9fFcGyw4VmIzJ-Ec5kBKt_9OkvNTwZ1Jcb4TKd-Ea09GkaIX_9Ok1kmm8DE1I5dK0M11R32MuVdllWOe54SRG4iuEgsLZ4rFE8ijleM01ygZNc0CQX_RgiYn5BK3z6PCoLdMm0VBi2451aHtpB-lyTB4h3fgYOzh_gx-U0cMOs-gGPlYth_w05EmbhO_jG08awtyhsLS1FsqiZc5F5Hz-G3kfIBg3qO6EdTHc–BO-w2f6uC5U_wGfUFDsbo1Z-Rc0tRzo}
}
Bell, J. On the problem of hidden variables in quantum mechanics 1966 Reviews of Modern Physics
Vol. 38, pp. 447–452 
article  
BibTeX:
@article{Bell1966,
  author = {Bell, J.S.},
  title = {On the problem of hidden variables in quantum mechanics},
  journal = {Reviews of Modern Physics},
  year = {1966},
  volume = {38},
  pages = {447--452}
}
Godfrey-Smith, P. On the Theoretical Role of Genetic Coding 2000 Philosophy of Science
Vol. 67, pp. 26-44 
article  
BibTeX:
@article{Godfrey-Smith2000,
  author = {Godfrey-Smith, P.},
  title = {On the Theoretical Role of Genetic Coding},
  journal = {Philosophy of Science},
  year = {2000},
  volume = {67},
  pages = {26-44}
}
Sommaruga, G. One or many concepts of information 2009
Vol. 5363, Formal theories of information: from Shannon to semantic information theory and general concepts of information, pp. 253-267 
incollection  
BibTeX:
@incollection{sommaruga_one_2009,
  author = {Sommaruga, Giovanni},
  title = {One or many concepts of information},
  booktitle = {Formal theories of information: from {Shannon} to semantic information theory and general concepts of information},
  year = {2009},
  volume = {5363},
  pages = {253--267}
}
Schroeder, M.J. Ontological study of information: identity and state 2014 Kybernetes
Vol. 43(6), pp. 882-894 
article  
Abstract: Purpose - The purpose of this paper is to demonstrate that a sufficiently general concept of information, encompassing multi-disciplinary scientific conceptualizations of the term, can be useful for a discussion of long-standing philosophical problems. Design/methodology/approach - The author uses his concepts of information and its integration, along with their mathematical formalization introduced in earlier publications, to describe what constitutes an object, its identity and state. The concept of information used here is defined in terms of the categorical opposition of the one-and-many, which plays a central role in the philosophical tradition. Its formalization is closely related to the formalisms of many theories involved in scientific disciplines. These features produce a common stage for philosophical discourse and scientific analysis. Findings - The formalism based on the author's concept of information opens philosophical concepts such as object, identity and state to analysis consistent with scientific methodology. The analysis, consistent with modern physical theories such as quantum mechanics, permits resolution of paradoxical aspects of an object's identity that have long puzzled philosophers. Originality/value - The approach to information applied here was introduced in earlier publications, but the analysis of the problems of identity in this context is novel and unprecedented. The author hopes that even those who prefer different conceptualizations of information can benefit from the present exposition by considering it an example of bridging philosophical and scientific discourse.
BibTeX:
@article{j._schroeder_ontological_2014,
  author = {Schroeder, Marcin J.},
  title = {Ontological study of information: identity and state},
  journal = {Kybernetes},
  year = {2014},
  volume = {43},
  number = {6},
  pages = {882--894}
}
Floridi, L. Open Problems in the Philosophy of Information 2004 Metaphilosophy
Vol. 35(4), pp. 554-582 
article URL 
Abstract: The philosophy of information (PI) is a new area of research with its own field of investigation and methodology. This article, based on the Herbert A. Simon Lecture of Computing and Philosophy I gave at Carnegie Mellon University in 2001, analyses the eighteen principal open problems in PI. Section 1 introduces the analysis by outlining Herbert Simon's approach to PI. Section 2 discusses some methodological considerations about what counts as a good philosophical problem. The discussion centers on Hilbert's famous analysis of the central problems in mathematics. The rest of the article is devoted to the eighteen problems. These are organized into five sections: problems in the analysis of the concept of information, in semantics, in the study of intelligence, in the relation between information and nature, and in the investigation of values.
BibTeX:
@article{floridi_open_2004,
  author = {Floridi, Luciano},
  title = {Open Problems in the Philosophy of Information},
  journal = {Metaphilosophy},
  year = {2004},
  volume = {35},
  number = {4},
  pages = {554--582},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1Na9wwEBUlCSUQStMPZ5sUfCi9FBtbH7Z1yCHZpATSQGk3kJuQLQn20N0064Xk32fGsr32QiHn3taLvCvpDaMZMe8NIYzGSbTlE3QiE0vLrLTg_XjOtBRUGiG0oaLSho1vtnvp5813_wPwWCLyre0Ts-rKGO-7jgVPXiOi5ywOg9MbW-vNwA2yy4e58STqNd6BLEc3BXzo92gGDtc3sImtd3XoIqX0jUQ6X-ilQ1rM-cCxCS_13J6RwjcMGstXbx0rfbEf5ECo2ifkV1Qz_2PmVX1qF9HtbyT-87TYIbvnv-6Q9tMdoCL3zKF20qOQYQ_ZOuvVVpwwTi2a2GD2lrxpg_rwzINxSF7ZxTuy_7PfyvckRlTCDpVwvggBlXCz2eHShQNUPpDb75ez6VXUtqqIKghoM0jnuci1Y0bLwiRaclR-SzVzsPbKclGWKeR1MrVZqR01UuRlYpkttJUQAaaWfSQHGikNi7qhPpqA7DowQxtgTBDAigLy-k7-uCiurqf-8bB7jFcNPy_-Wwewg40VR1mcH5GQMUtp4mhVQNbLKiRJO8MoE9pJiAzdhKTdxqp7L26iRklhrtBAsFUpRxFZlqnHCfniEejfoGpFVaKa5FVwSDdU_VhPyNHWsN4M4BeG0PUDGi0mrAIQTSIDk3vJsGkrZY8SDvCvWWMGL16NurmcncGnT_-e7jHZ99VaeO12Qnbqh7X97EU7nwFWKIIB}
}
Floridi, L. Outline of a Theory of Strongly Semantic Information 2004 Minds and Machines
Vol. 14(2), pp. 197-221 
article URL 
Abstract: This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that the classic quantitative theory of weakly semantic information (TWSI), based on probability distributions, assumes that truth-values supervene on factual semantic information, yet this principle is too weak and generates a well-known semantic paradox, whereas TSSI, according to which factual semantic information encapsulates truth, can avoid the paradox and is more in line with the standard conception of what generally counts as semantic information. After a brief introduction, section two outlines the semantic paradox implied by TWSI, analysing it in terms of an initial conflict between two requisites of a quantitative theory of semantic information. In section three, three criteria of semantic information equivalence are used to provide a taxonomy of quantitative approaches to semantic information and introduce TSSI. In section four, some further desiderata that should be fulfilled by a quantitative TSSI are explained. From section five to section seven, TSSI is developed on the basis of a calculus of truth-values and semantic discrepancy with respect to a given situation. In section eight, it is shown how TSSI succeeds in solving the paradox. Section nine summarises the main results of the paper and indicates some future developments.
BibTeX:
@article{floridi_outline_2004,
  author = {Floridi, Luciano},
  title = {Outline of a Theory of Strongly Semantic Information},
  journal = {Minds and Machines},
  year = {2004},
  volume = {14},
  number = {2},
  pages = {197--221},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV3Na9VAEB9EQQRpbdU0VSGHHrwkzX4l2Xqqr5aWVgWfFW9hk-wWUdvXvjzQ_74z2ST9RIqQQ0Jms5vMZGZ2Z-a3AIInaXxDJ6BR4ZYzo7VSlZE8bzLpCsVTa02Oevr6yja8G1cyTn4mQ4Cy09sDNM_7rY_7n3Y8-iBnWSETtHxcJTXV8qHNImH_Mv02BhTo6GD3uIwznBQM-KP_fBTBh9ans8X8Spj0TgPVGaPdZRjiEkMSyhiZvqydv52k_T8v-QyWeoc12vYStgIP7MkqLA-bQUS9bngO8vOiJZc1OnWRiXzJP51Paa39-NffaGp_Ixd_1FFfAUUS8QKOdj98nezF_ZYMcY2eo4ydMoVuuCuM48yyurZpxZgjRKDMKOaYyLnItc4tb3IrtHTcOi4r02jbpJURL-GpodR96g-d6SaARw7_MxuQ7QvwQwbw-Ls-3Cn2Dib-cmW4TOZdHVpy1gbI1e43jbMkX4NIpY1RArswaSVdZtDpy3RhRFqh_4vCFsKG52M58xAeJS_nvMTpT4F3hRZSlu2fNoTgBplA261Qq4WwOTBovMeIRbSLpyx7FpSKYQMhy1njQnh7q0VH65sxiUPoyXF0VyVqJCem02YBwk_kQmD3IZv00O4EadCu338Ur-CJT0qiXM7X8LA9X9g3HpvyAklpE_w}
}
Gray, R.M. Pair Processes: Channels, Codes, and Couplings 2011 Entropy and Information Theory, pp. 21-60  incollection  
BibTeX:
@incollection{gray_pair_2011,
  author = {Gray, Robert M.},
  title = {Pair Processes: Channels, Codes, and Couplings},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {21--60}
}
Delancey, C. Phenomenal Experience and the Measure of Information 2007 Erkenntnis (1975-)
Vol. 66(3), pp. 329-352 
article URL 
Abstract: This paper defends the hypothesis that phenomenal experiences may be very complex information states. This can explain some of our most perplexing anti-physicalist intuitions about phenomenal experience. The approach is to describe some basic facts about information in such a way as to make clear the essential oversight involved, by way of illustrating how various intuitive arguments against physicalism (such as Frank Jackson's Knowledge Argument, and Thomas Nagel's Bat Argument) can be interpreted to show that phenomenal information is not different in kind from physical information, but rather is just more information than we typically attribute to our understanding of a physical theory. I clarify how this hypothesis is distinct from Nagel's claim that the theory of consciousness may be inconceivable, and then in conclusion briefly describe how these results might suggest a positive and conservative physicalist account of phenomenal experience.
BibTeX:
@article{delancey_phenomenal_2007,
  author = {Delancey, Craig},
  title = {Phenomenal Experience and the Measure of Information},
  journal = {Erkenntnis (1975-)},
  year = {2007},
  volume = {66},
  number = {3},
  pages = {329--352},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV07T8MwELZQJxbehfCQsrAgAonjOPaEEKJiQeoAsxW_xFClpSoD_fXcxUlKCwzsdpzcOXf33ZOQnN6kyYZM0KAlvJCOMalTrYVhJudcO7BWrRGFX_ds954MTLJssgSbmD6YS3ribmnJQcJKcTd7T3B6FEZZ21EaIIqxRVkYZd4L5BZ3YeFOghCoC26GCjqOk1cAVkvQVslyTT21QjokKq6ZoBtR00YZjXZJl_3WJaH0kelV7fwvSdr__Mg9stMaq_F9uF37ZMvVB2R73E0_-DwkbPzmamziAMtWXZPjqrYxWJbxc_BAxlMft3VPeA-OyOvo8eXhKWkHMSSGppwmJs29A8lYFdIZIXhZCSMqKXUGGLRijqUOcJEHZGmdkKnLM59ViGSK3BeS2nxIBvW0dickZt6WxlDtDSuYMRoMvtJ7qi11AFxcEZGrjv5qFvptqFVnZWSWwkw8ZJZaRuQYOaTwX1zMK6MyDtYOYKQsIsOGpP0zOnrClo6Lyk4mCuw7wBcFxy3XHVO_nYwHotdFtaQOJ8-sj8jlj-Xo9233cK7yZu3pX29yRraDnxgv5TkZLOYf7iI0g_wCs7v0Rg}
}
Floridi, L. Philosophical conceptions of information 2009
Vol. 5363, Formal theories of information: from Shannon to semantic information theory and general concepts of information, pp. 13-53 
incollection  
BibTeX:
@incollection{floridi_philosophical_2009,
  author = {Floridi, Luciano},
  title = {Philosophical conceptions of information},
  booktitle = {Formal theories of information: from {Shannon} to semantic information theory and general concepts of information},
  year = {2009},
  volume = {5363},
  pages = {13--53}
}
Floridi, L. Philosophy and computing: an introduction 1999   book URL 
BibTeX:
@book{floridi_philosophy_1999,
  author = {Floridi, Luciano},
  title = {Philosophy and computing: an introduction},
  publisher = {Routledge},
  year = {1999},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3PS8MwFH64CTIQdNNl_hj0H2hpX5K2OU_HQA8ePHgbTZMeC2o9-N-bl7asHfP4kVCS0JcvebzvCwDHKA6P9oQ0FZJTIaMu48QIXklVcY2YGXdg4F7wPchsn7o3jgTofQaDCiBRyQlM3CVvEJV-G0Ykb7hOhU5GZ0IMgMSD_Y5vlGMsx858nm621zAlCcIczmy9gKv-4YWgi8MFzN76Bwh-b2B1AEFRm6D03R0j3cJ6-_y-2YX0_X2Xp9nrbjYJLuGyoPr2uvE6OMNg2nz9WEb8wNxQGFx8qNenfPeyaeG8h9G312pFnw1zdOR_5TCNshUEReGYXWQq5pUWNkWVC17mOrFaorS5uYPl6dHc_9fwALPWw4DyEY9wXrl4set20f4ApeKDcQ}
}
Adriaans, P. and Benthem, J.v. Philosophy of information 2008
Vol. 8. 
book  
BibTeX:
@book{adriaans_philosophy_2008,
  author = {Adriaans, Pieter and Benthem, Johan v.},
  title = {Philosophy of information},
  publisher = {North-Holland},
  year = {2008},
  volume = {8}
}
Bokulich, A. and Jaeger, G. Philosophy of quantum information and entanglement 2010   book URL 
BibTeX:
@book{bokulich_philosophy_2010,
  author = {Bokulich, Alisa and Jaeger, Gregg},
  title = {Philosophy of quantum information and entanglement},
  publisher = {Cambridge University Press},
  year = {2010},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3Pq8IwDA5PH4I39Sn-hP4Dysxm151F8ejBu7g29eQeIoL-96Zbh0702BZaUtI0X5p8BQhxFkzfbEKkzDxVhEisUBgpK3Vi5yZRGFOiF3E1sv0JN1YK0MsIhmPLY4-kBjUGeS-nMjfDrI2Iga9PVYyt5cIz7pRtVSHfy2-UdQvqrsqgDT-UdaC5LT8UuHegkSdl6ssf4LNb_FtxvvIuXE_Cc526HRWHzAiX_Z0dizTwLkzWq91yM3UL7n1sZp96CW7YgzrjfeqDCI1lJyogBkYyCkyUoiQpD6HWliiOzQB6n-cYfhsYQbN49HaRgzH8WtZsmhSyPwDdlnU1}
}
Esfeld, M. Physicalism and Ontological Holism 1999 Metaphilosophy
Vol. 30(4), pp. 319-337 
article  
Abstract: The claim of this paper is that we should envisage physicalism as an ontological holism. Our current basic physics, quantum theory, suggests that, ontologically speaking, we have to assume one global quantum state of the world; many of the properties that are often taken to be intrinsic properties of physical systems are in fact relations, which are determined by that global quantum state. The paper elaborates on this conception of physicalism as an ontological holism and considers issues such as supervenience, realization of higher‐order properties by basic physical properties, and reduction.
BibTeX:
@article{esfeld_physicalism_1999,
  author = {Esfeld, Michael},
  title = {Physicalism and Ontological Holism},
  journal = {Metaphilosophy},
  year = {1999},
  volume = {30},
  number = {4},
  pages = {319--337}
}
Frieden, B.R. and Gatenby, R.A. Power laws of complex systems from extreme physical information 2005 Phys. Rev. E
Vol. 72(3), pp. 036101 
article URL 
Abstract: Many complex systems obey allometric, or power, laws y = Yx^a. Here y ≥ 0 is the measured value of some system attribute a, Y ≥ 0 is a constant, and x is a stochastic variable. Remarkably, for many living systems the exponent a is limited to values n/4, n = 0, ±1, ±2, ... Here x is the mass of a randomly selected creature in the population. These quarter-power laws hold for many attributes, such as pulse rate (n = -1). Allometry has, in the past, been theoretically justified on a case-by-case basis. An ultimate goal is to find a common cause for allometry of all types and for both living and nonliving systems. The principle I - J = extremum of extreme physical information is found to provide such a cause. It describes the flow of Fisher information J → I from an attribute value a on the cell level to its exterior observation y. Data y are formed via a system channel function y ≡ f(x,a), with f(x,a) to be found. Extremizing the difference I - J through variation of f(x,a) results in a general allometric law f(x,a) ≡ y = Yx^a. Darwinian evolution is presumed to cause a second extremization of I - J, now with respect to the choice of a. The solution is a = n/4, n = 0, ±1, ±2, ..., defining the particular powers of biological allometry. Under special circumstances, the model predicts that such biological systems are controlled by only two distinct intracellular information sources. These sources are conjectured to be cellular DNA and cellular transmembrane ion gradients.
BibTeX:
@article{Frieden2005,
  author = {Frieden, B. R. and Gatenby, R. A.},
  title = {Power laws of complex systems from extreme physical information},
  journal = {Phys. Rev. E},
  year = {2005},
  volume = {72},
  number = {3},
  pages = {036101},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1dT9YwFD4hosbEoCjOoSS90CvdWD-2rlfEvEJI1MQYTbhbun5cET7cEH4-p-32Ivhe4GXTrlt72p7T9nmeAXBWVsWdNUFz66w1VjlmuW6oY8IK18teeiqYdLdPtuHj6gt9WvHdAIz84f7slxJfw9H5R-Y6BuGBvvHtaF6FOboikbRSFU6iup4JMyuruOWUlp7oYSCGXAwrXVJ0PwfPYEbfz7CT5V30DVv-X1j2vZr1HDammJR8SoNoE9bcyQt4FLGhZngJe9_Dn9TIsb4cyKknEYTurkjSgB5IIKgQXOLDQSM5m8xOJkHWYPYt-HWw_3NxWEz_XSg0bRpVaEtZ73zLZNX31PHGqLb2tTaaitpiiOeNxP1ri6Ekk4wpai3zmlkpjTCBqPsKnuqAzz8ZI4_PZrDucTK5LDi4DPsug8dH6uvn9vDLIiU352Q5RLJZeT5maMk4F4umlK-BsF55Y02trbJC91r1-C0MNwfcswZXqhw-zDbszpJSRxd3OBXv5t7tJOtS7-aQJTPflG1CMFOpHN4nuy9zWDfgYx1rWyGwzUqobrwasYY75QSVUUAsh3d_D5hlfhQXitzjGJnnQO9TbDFptQeNgnH7vxr5Bp5EpdkIiXsLD8bfF24n6U1eA5myDTI}
}
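The quarter-power allometric law y = Yx^(n/4) described in the abstract above can be sketched numerically. This is a minimal illustration only; the values of Y and n below are arbitrary assumptions for the example, not taken from the paper:

```python
def allometric(x, Y=1.0, n=-1):
    """Evaluate the allometric law y = Y * x**a with a quarter-power
    exponent a = n/4, as discussed in Frieden & Gatenby's abstract."""
    a = n / 4.0
    return Y * x ** a

# With n = -1 (e.g., pulse rate vs. body mass), a creature 16 times
# more massive has half the attribute value, since 16**(-1/4) = 1/2.
print(allometric(1.0, Y=100.0, n=-1))            # 100.0
print(round(allometric(16.0, Y=100.0, n=-1), 6))  # 50.0
```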
Dretske, F.I. Précis of Knowledge and the Flow of Information 1983 Behavioral and Brain Sciences
Vol. 6(1), pp. 55-63 
article  
BibTeX:
@article{dretske_precis_1983,
  author = {Dretske, Fred I.},
  title = {Précis of Knowledge and the Flow of Information},
  journal = {Behavioral and Brain Sciences},
  year = {1983},
  volume = {6},
  number = {1},
  pages = {55--63}
}
Bywater, R.P. Prediction of protein structural features from sequence data based on Shannon entropy and Kolmogorov complexity 2015 PloS one
Vol. 10(4), pp. e0119306 
article  
Abstract: While the genome for a given organism stores the information necessary for the organism to function and flourish it is the proteins that are encoded by the genome that perhaps more than anything else characterize the phenotype for that organism. It is therefore not surprising that one of the many approaches to understanding and predicting protein folding and properties has come from genomics and more specifically from multiple sequence alignments. In this work I explore ways in which data derived from sequence alignment data can be used to investigate in a predictive way three different aspects of protein structure: secondary structures, inter-residue contacts and the dynamics of switching between different states of the protein. In particular the use of Kolmogorov complexity has identified a novel pathway towards achieving these goals.
BibTeX:
@article{bywater_prediction_2015,
  author = {Bywater, Robert P.},
  title = {Prediction of protein structural features from sequence data based on Shannon entropy and Kolmogorov complexity},
  journal = {PloS one},
  year = {2015},
  volume = {10},
  number = {4},
  pages = {e0119306}
}
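The Shannon-entropy side of the approach summarized above can be illustrated with a short sketch. This shows only the entropy measure itself, not a reconstruction of Bywater's method, and the alignment column is invented for the example:

```python
import math
from collections import Counter

def shannon_entropy(column):
    """Shannon entropy (in bits) of the residues in one multiple-sequence-
    alignment column: H = -sum(p_i * log2(p_i)) over residue frequencies."""
    counts = Counter(column)
    total = len(column)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fully conserved column has zero entropy; a column with four
# equally frequent residues has log2(4) = 2 bits.
print(shannon_entropy("ACDE"))  # 2.0
```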
Frieden, B. and Gatenby, R. Principle of maximum Fisher information from Hardy's axioms applied to statistical systems 2013 PHYSICAL REVIEW E
Vol. 88(4), pp. 042144 
article URL 
Abstract: Consider a finite-sized, multidimensional system in parameter state a. The system is either at statistical equilibrium or general nonequilibrium, and may obey either classical or quantum physics. L. Hardy's mathematical axioms provide a basis for the physics obeyed by any such system. One axiom is that the number N of distinguishable states a in the system obeys N = max. This assumes that N is known as deterministic prior knowledge. However, most observed systems suffer statistical fluctuations, for which N is therefore only known approximately. Then what happens if the scope of the axiom N = max is extended to include such observed systems? It is found that the state a of the system must obey a principle of maximum Fisher information, I = I_max. This is important because many physical laws have been derived, assuming as a working hypothesis that I = I_max. These derivations include uses of the principle of extreme physical information (EPI). Examples of such derivations were of the De Broglie wave hypothesis, quantum wave equations, Maxwell's equations, new laws of biology (e.g., of Coulomb force-directed cell development and of in situ cancer growth), and new laws of economic fluctuation and investment. That the principle I = I_max itself derives from suitably extended Hardy axioms thereby eliminates its need to be assumed in these derivations. Thus, uses of I = I_max and EPI express physics at its most fundamental level, its axiomatic basis in math.
BibTeX:
@article{frieden_principle_2013,
  author = {Frieden, BR and Gatenby, RA},
  title = {Principle of maximum Fisher information from Hardy's axioms applied to statistical systems},
  journal = {PHYSICAL REVIEW E},
  year = {2013},
  volume = {88},
  number = {4},
  pages = {042144},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3PS8MwFH6IKAjib139ATkIHnQzTZqsPcrY8CKIKIiXkmQJeNgmthP_fF-atTrdYZ56aGjTvPS970ve-wLAWYe2f_kEZq0S1BqlnMIAZh1H4qGt5HzYTazQ8yvbcLV4Qz-m_NonRj7Yjz5auIMTDvmAr1xHEO7LN-6eay_MMRQlQSs1w59IiLpgZuEj5oJSE4nWfGHItFgYkqrwM9iGOvu-Tjtp9qK_q-X_pmUv9Vk7sDXDpOQmTKJdWLHjPVivckNNsQ8v9_WCPJk4MlKfr6PpiIQz08lMeNWbl_hSFVKlAlwUBJtNRngJKJeUE-JrlypZaHxXEJAuDuBp0H_s3bZnRzK0FQK3pG0kVTLzvNZpqzKjJEWEo7Qw6DS5MDK1qXbSxMqmXYSeItNx3FVpNtTUIfPih7CpfOr-uKxK_IYtIFQjjWKCUUNdkhiaZo4ZxajQWirnkggua-Pkb0GCI6-oC-V5PWx5muZh2CI4CvZr2iIIYdgTFkErGLS5w7uIcHhGZQTnP03cNPCMESkm-qiKS0UQL9OsN1NX96oC5fG_en8CG8yft1FlC57Cavk-tWdBIfILtoT63A}
}
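The Fisher information I that the abstract above maximizes has a standard textbook instance that is easy to check numerically: for the mean of a normal distribution, I = 1/sigma^2. A minimal Monte Carlo sketch (an illustration of the quantity, not of the paper's derivation):

```python
import random

def fisher_information_normal_mean(sigma, n=200_000, seed=1):
    """Monte Carlo estimate of the Fisher information of the mean mu of
    N(mu, sigma^2): I = E[(d/dmu ln p(x))^2] = 1/sigma^2.
    The score function at mu = 0 is x / sigma**2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        total += (x / sigma**2) ** 2
    return total / n

# The estimate should be close to the exact value 1/sigma^2 = 0.25.
print(fisher_information_normal_mean(2.0))  # roughly 0.25
```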
Kirpatovskii, S.I. Principles of the information theory of measurements 1974 Measurement Techniques
Vol. 17(5), pp. 655-659 
article  
BibTeX:
@article{kirpatovskii_principles_1974,
  author = {Kirpatovskii, S. I.},
  title = {Principles of the information theory of measurements},
  journal = {Measurement Techniques},
  year = {1974},
  volume = {17},
  number = {5},
  pages = {655--659}
}
Allo, P. Putting information first: Luciano Floridi and the philosophy of information 2010   book URL 
BibTeX:
@book{allo_putting_2010,
  author = {Allo, Patrick},
  title = {Putting information first: Luciano Floridi and the philosophy of information},
  publisher = {Wiley-Blackwell},
  year = {2010},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1LC8IwDA4-LoLgG-cD9geUPbquPYvDowfvo6sreFFBPfjvTboVVPRYekgITZuk-b4AxNE6WH3dCRhlyNIkShrNVSIjoZnhhsXENh5oO3LprbL9K2_8AKC7CkYs8MFORROamOS9eaW9hpkgGKgFczGGmRhPueN4cmviB9WX6-P2wcJnn5asDy2CGwygUZ6H0HNDFvza54bQ2bthA88RePuH7VP2a75TsqpvThjBjWGZbQ-b3YoE5HVRJi9q1cNoAl1FzeznuwW9Hafgp0xygY4Rap2yQiWqQMcsOZHXxaEx0oNppXZ-rSgpckwMwijA0MCDyW85s38bc-hUP-JUVlhA2-CxL5eVPV7cx3wM}
}
Nielsen, M.A. and Chuang, I.L. Quantum computation and quantum information 2000   book  
BibTeX:
@book{nielsen_quantum_2000,
  author = {Nielsen, Michael A. and Chuang, Isaac L.},
  title = {Quantum computation and quantum information},
  publisher = {Cambridge University Press},
  year = {2000}
}
Bub, J. Quantum Entanglement and Information 2010 The Stanford Encyclopedia of Philosophy  misc URL 
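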
BibTeX:
@misc{Bubb2010,
  author = {Bub, J.},
  title = {Quantum Entanglement and Information},
  year = {2010},
  url = {http://plato.stanford.edu/entries/qt-entangle/}
}
Bennett, C.H. and DiVincenzo, D.P. Quantum information and computation 2000 Nature
Vol. 404(6775), pp. 247-255 
article  
Abstract: In information processing, as in physics, our classical world view provides an incomplete approximation to an underlying quantum reality. Quantum effects like interference and entanglement play no direct role in conventional information processing, but they can be harnessed to break codes, create unbreakable codes, and speed up otherwise intractable computations.
BibTeX:
@article{bennett_quantum_2000,
  author = {Bennett, Charles H. and DiVincenzo, David P.},
  title = {Quantum information and computation},
  journal = {Nature},
  year = {2000},
  volume = {404},
  number = {6775},
  pages = {247--255}
}
Peres, A. Quantum information and general relativity 2004 Fortschritte der Physik
Vol. 52(11-12), pp. 1052-1055 
article  
Abstract: The Einstein-Podolsky-Rosen paradox (1935) is reexamined in the light of Shannon's information theory (1948). The EPR argument did not take into account that the observers' information was localized, like any other physical object. General relativity introduces new problems: there are horizons which act as one-way membranes for the propagation of quantum information, in particular black holes which act like sinks.
BibTeX:
@article{peres_quantum_2004-1,
  author = {Peres, Asher},
  title = {Quantum information and general relativity},
  journal = {Fortschritte der Physik},
  year = {2004},
  volume = {52},
  number = {11-12},
  pages = {1052--1055}
}
Peres, A. and Terno, D.R. Quantum information and relativity theory 2004 Reviews of Modern Physics
Vol. 76(1), pp. 93-123 
article  
Abstract: This article discusses the intimate relationship between quantum mechanics, information theory, and relativity theory. Taken together these are the foundations of present-day theoretical physics, and their interrelationship is an essential part of the theory. The acquisition of information from a quantum system by an observer occurs at the interface of classical and quantum physics. The authors review the essential tools needed to describe this interface, i.e., Kraus matrices and positive-operator-valued measures. They then discuss how special relativity imposes severe restrictions on the transfer of information between distant systems and the implications of the fact that quantum entropy is not a Lorentz-covariant concept. This leads to a discussion of how it comes about that Lorentz transformations of reduced density matrices for entangled systems may not be completely positive maps. Quantum field theory is, of course, necessary for a consistent description of interactions. Its structure implies a fundamental tradeoff between detector reliability and localizability. Moreover, general relativity produces new and counterintuitive effects, particularly when black holes (or, more generally, event horizons) are involved. In this more general context the authors discuss how most of the current concepts in quantum information theory may require a reassessment.
BibTeX:
@article{peres_quantum_2004,
  author = {Peres, Asher and Terno, Daniel R.},
  title = {Quantum information and relativity theory},
  journal = {Reviews of Modern Physics},
  year = {2004},
  volume = {76},
  number = {1},
  pages = {93--123}
}
Melkikh, A.V. Quantum information and the problem of mechanisms of biological evolution 2014 Bio Systems
Vol. 115, pp. 33 
article URL 
Abstract: One of the most important conditions for replication in early evolution is the de facto elimination of the conformational degrees of freedom of the replicators, the mechanisms of which remain unclear. In addition, realistic evolutionary timescales can be established based only on partially directed evolution, further complicating this issue. A division of the various evolutionary theories into two classes has been proposed based on the presence or absence of a priori information about the evolving system. A priori information plays a key role in solving problems in evolution. Here, a model of partially directed evolution, based on the learning automata theory, which includes a priori information about the fitness space, is proposed. A potential repository of such prior information is the states of biologically important molecules. Thus, the need for extended evolutionary synthesis is discussed. Experiments to test the hypothesis of partially directed evolution are proposed.
BibTeX:
@article{melkikh_quantum_2014,
  author = {Melkikh, Alexey V.},
  title = {Quantum information and the problem of mechanisms of biological evolution},
  journal = {Bio Systems},
  year = {2014},
  volume = {115},
  pages = {33},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V3JTsMwELWgEhIXxL5LPnFBqZI4dpwDhxaByhG15RoltgOhtEUlReLvGcfO0gI_wDGOItl-8XjWNwgRv-s6azIhiXhAUiJBuYgS4mXCzxTxNBt6CsdRiTXPdtXHsBn7D8A_LmGzltNrS4laVPnGpiaqbB9jguq65jf_mJa5HIaLqQRMfdpJrcR783mb27xk7H2b5JOXqkgGRMtT24PgBWsehJ-lLaacyiU6_c3chspIRx6C-CSm6LkWnx5tCUDDavFDLhsXwWsXVmMYqjVTuke6ZWIdbe6iKv7eHw6sqbtKhq29VITziF5pZvSpzEVxo2bOeLgJZjfXxvew129uYlo2qa2XYjO5TH7f71Opb-YVI6NUNka7aMdaCbhn0N1DG2q2j7ZM39CvA_RgMcYtjDFgjAFjbDHG8ww3GOunBmNcY3yIxvd3o9uBY3tiOM9gvFOHgYIFSlfmC6mD2nCAaOiKRGZZEEnXk76f0iTUzm3iKiI5Y1plphxGuRB-So5QZzafqROEQxZ6hAWS6B4CgiWR4CTjLAlYpFyZBqfoWG9FrJdSLBIR15sPb8zuxO-GEiWGKXEwfYOzP785R9vNv3eBOsViqS4N9-U3FFlKbw}
}
Pitalúa-García, D. Quantum Information Causality 2013 Phys. Rev. Lett.
Vol. 110(21), pp. 210402 
article DOI URL 
BibTeX:
@article{pitalua-garcia_quantum_2013,
  author = {Pitalúa-García, Damián},
  title = {Quantum Information Causality},
  journal = {Phys. Rev. Lett.},
  year = {2013},
  volume = {110},
  number = {21},
  pages = {210402},
  url = {http://link.aps.org/doi/10.1103/PhysRevLett.110.210402},
  doi = {10.1103/PhysRevLett.110.210402}
}
DiVincenzo, D.P. and Loss, D. Quantum information is physical 1998 Superlattices and Microstructures
Vol. 23(3), pp. 419-432 
article  
Abstract: We discuss a few current developments in the use of quantum mechanically coherent systems for information processing. In each of these developments, Rolf Landauer has played a crucial role in nudging us, and other workers in the field, into asking the right questions, some of which we have been lucky enough to answer. A general overview of the key ideas of quantum error correction is given. We discuss how quantum entanglement is the key to protecting quantum states from decoherence in a manner which, in a theoretical sense, is as effective as the protection of digital data from bit noise. We also discuss five general criteria which must be satisfied to implement a quantum computer in the laboratory, and we illustrate the application of these criteria by discussing our ideas for creating a quantum computer out of the spin states of coupled quantum dots.
BibTeX:
@article{divincenzo_quantum_1998,
  author = {DiVincenzo, D. P. and Loss, D.},
  title = {Quantum information is physical},
  journal = {Superlattices and Microstructures},
  year = {1998},
  volume = {23},
  number = {3},
  pages = {419--432}
}
Tomamichel, M. Quantum information processing with finite resources: mathematical foundations 2016
Vol. 5 
book  
BibTeX:
@book{tomamichel_quantum_2016,
  author = {Tomamichel, Marco},
  title = {Quantum information processing with finite resources: mathematical foundations},
  publisher = {Springer},
  year = {2016},
  volume = {5}
}
Wilde, M. Quantum information theory 2013   book  
BibTeX:
@book{wilde_quantum_2013,
  author = {Wilde, Mark},
  title = {Quantum information theory},
  publisher = {Cambridge University Press},
  year = {2013}
}
Hayashi, M. Quantum Information Theory: Mathematical Foundation 2017   book  
BibTeX:
@book{hayashi_quantum_2017-1,
  author = {Hayashi, Masahito},
  title = {Quantum Information Theory: Mathematical Foundation},
  publisher = {Springer Berlin Heidelberg},
  year = {2017},
  edition = {2nd}
}
Fayngold, M. and Fayngold, V. Quantum mechanics and quantum information 2013   book  
BibTeX:
@book{fayngold_quantum_2013,
  author = {Fayngold, Moses and Fayngold, Vadim},
  title = {Quantum mechanics and quantum information},
  publisher = {Wiley},
  year = {2013}
}
Fuchs, C.A. Quantum mechanics as quantum information, mostly 2003 Journal of Modern Optics
Vol. 50(6-7), pp. 987-1023 
article  
BibTeX:
@article{fuchs_quantum_2003,
  author = {Fuchs, Christopher A.},
  title = {Quantum mechanics as quantum information, mostly},
  journal = {Journal of Modern Optics},
  year = {2003},
  volume = {50},
  number = {6-7},
  pages = {987--1023}
}
Bub, J. Quantum Mechanics is About Quantum Information 2005 Foundations of Physics
Vol. 35(4), pp. 541-560 
article URL 
Abstract: I argue that quantum mechanics is fundamentally a theory about the representation and manipulation of information, not a theory about the mechanics of nonclassical waves or particles. The notion of quantum information is to be understood as a new physical primitive—just as, following Einstein's special theory of relativity, a field is no longer regarded as the physical manifestation of vibrations in a mechanical medium, but recognized as a new physical entity in its own right.
BibTeX:
@article{bub_quantum_2005,
  author = {Bub, Jeffrey},
  title = {Quantum Mechanics is About Quantum Information},
  journal = {Foundations of Physics},
  year = {2005},
  volume = {35},
  number = {4},
  pages = {541--560},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9QwEB4hnpUQ0C1Nw0PKATggEiV-xMmxWqgqARJvcbOS2JYQ6lKarFT-PTN2srtdeii3TdbreD1jz-d5fAHgLMvTrT0hr03pcwB5LVSHdqPkwuWcM25VJXN70bMNbOXJWPzMpgCl37c3St8UHYRzkfp4LsFINFWk458-f1vT7qpAIIVGjxIRqimueVkPFyzTuD_fpNqQZb8RJr3UQHljdHQfpiKZKQllFZle187_m6T9H3_yAdwbcWpyGBRrF67ZxQwODnvynP86-ZO8SPzn4BjpZ3BnPr05bga3PoS7e5B9XKLklifJe0sVxngv-dEnFGoakumrsSCKFOQhfD1682V-nI5vaEgblkuWGtNyYSrWOYR1tjMlnkalMjViKidk03BZt65QneiEaVXRMt5Y2dTMONaKInd8H-42lMm_GHzFn4nghsNlZyMyhRHOawS3v9fvXlfHb-fhcne6zHpflpb9HiKUtl-1aZmpA0hK5oywbdOWzAjlWC1s1dQIRVuO6ieLGF5OgtangdNDr9mbaco1TrmmKdfnMUSkCprW-3DWdBoBT8mJlD6G50E7Vp0w3TONHSkhEOdVpeR6OB-wh612CJIlnn1lDK8muW8MhJ5Pe5ceRRwGcmocPXC7OTkFx99wqcOgY3i2qaKrtoHsiE6YhFJYDMVVms1HrnjiSBgeXXEIj2HHc9z6FKcncH04W9qngenyLyGBLVs}
}
Holevo, A.S. Quantum Systems, Channels, Information: A Mathematical Introduction 2012
Vol. 16 
book  
BibTeX:
@book{holevo_quantum_2012,
  author = {Holevo, Alexander S.},
  title = {Quantum Systems, Channels, Information: A Mathematical Introduction},
  publisher = {De Gruyter},
  year = {2012},
  volume = {16}
}
Wharton, K. Reality, no matter how you slice it 2015 It From Bit or Bit From It?, pp. 181-196  incollection  
BibTeX:
@incollection{wharton_reality_2015,
  author = {Wharton, Ken},
  title = {Reality, no matter how you slice it},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {181--196}
}
Brooks, M. Reality: It's nothing but information 2012 New Scientist
Vol. 215(2884), pp. 41 - 
article DOI URL 
Abstract: What we call reality might actually be the output of a program running on a cosmos-sized quantum computer
BibTeX:
@article{brooks_reality:_2012,
  author = {Brooks, Michael},
  title = {Reality: It's nothing but information},
  journal = {New Scientist},
  year = {2012},
  volume = {215},
  number = {2884},
  pages = {41 --},
  url = {http://www.sciencedirect.com/science/article/pii/S0262407912625196},
  doi = {10.1016/S0262-4079(12)62519-6}
}
Gray, R.M. Relative Entropy 2011 Entropy and Information Theory, pp. 173-218  incollection  
BibTeX:
@incollection{gray_relative_2011,
  author = {Gray, Robert M.},
  title = {Relative Entropy},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {173--218}
}
Rovelli, C. Relative information at the foundation of physics 2015 It From Bit or Bit From It?, pp. 79-86  incollection  
BibTeX:
@incollection{rovelli_relative_2015,
  author = {Rovelli, Carlo},
  title = {Relative information at the foundation of physics},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {79--86}
}
Ladyman, J. and Ross, D. Remodelling Structural Realism: Quantum Physics And The Metaphysics Of Structure 2003 Synthese
Vol. 136, pp. 31-56 
article  
BibTeX:
@article{Ladyman2003,
  author = {Ladyman, J. and Ross, D.},
  title = {Remodelling Structural Realism: Quantum Physics And The Metaphysics Of Structure},
  journal = {Synthese},
  year = {2003},
  volume = {136},
  pages = {31--56}
}
Bergstrom, C.T. and Rosvall, M. Response to commentaries on “The Transmission Sense of Information” 2011 Biology & Philosophy
Vol. 26(2), pp. 195-200 
article URL 
BibTeX:
@article{bergstrom_response_2011,
  author = {Bergstrom, Carl T. and Rosvall, Martin},
  title = {Response to commentaries on “The Transmission Sense of Information”},
  journal = {Biology \& Philosophy},
  year = {2011},
  volume = {26},
  number = {2},
  pages = {195--200},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9QwEB4hEKgSKrRAmvJQDnBBJErixLGP1UKpBAfUAuJmxXF8abW7dFOJ3vpD2j_XX8KMnaS7Sw_lGGXiVz7P-DHzDQDLkzRe0wnSCtNoxol-T9RZatpUG1x5VMy0wlR69WQb8vEkY3qcDBeUTm8vhb6VjDx9slgi6mLi-0RTRRg_PPo56mKy9p7dW8ZM8Gq417ythBXLtK6fiUm0mc3PFks3prfaKmeX9p_AEC8z-KOMl9Q3YfT_-mv_R3-fwma_ZI32PMa24F473YaHPonl-TZsfBuyIZw_g8-H3ue2jbpZhCU713TajUezaXR9cYmojJx5RHjROV101JLwzEZ9XBTh5Pri6jn82P_0fXIQ97ka4oa4zmPck6O6sFlZC97khcgKrQtpWKE5N5TNo82lZlbL2lpGLIHW8KIpysowaWuh2Qt4XJNP_7RzsX8mgAcWJ2AbkFEMcFgDePRLfv0oDr5M_OPW8JgsXIBa8rsL8L-7-RvzpNqByFiT85ZIEq0pKkRgkRlpdWbcblzrEN4Pv1zNPbuHuuFxphFXOOKKRlyxEN4QKJQPUB01g2KoFilrSx5C4ARINXSndbPyZgCSMicnKmcIRkdFGMI7j6ux_lwtcpUqIuorBcf9JFPdnw5LWJOjxDyobLHsDwNilvpATae1nerB4fswN5YqXBd3gv03HBtAsiG8XQb3KEsmVKZoBhgtdcoQsruITXrCeSJa6Hbv2ISXsOFP8Mnj7xXc707P2teeLvMvG5ZHPg}
}
Rosvall, M. and Bergstrom, C.T. Response to commentaries on “The transmission sense of information”: discussion note 2011 Biology & Philosophy
Vol. 26(2), pp. 195 
article  
BibTeX:
@article{rosvall_response_2011,
  author = {Rosvall, Martin and Bergstrom, Carl T.},
  title = {Response to commentaries on “The transmission sense of information”: discussion note},
  journal = {Biology \& Philosophy},
  year = {2011},
  volume = {26},
  number = {2},
  pages = {195}
}
Sdrolia, C. and Bishop, J.M. Rethinking Construction: On Luciano Floridi's 'Against Digital Ontology' 2014 Minds and Machines: Journal for Artificial Intelligence, Philosophy and Cognitive Science
Vol. 24(1), pp. 89 
article  
Abstract: In the fourteenth chapter of The Philosophy of Information, Luciano Floridi puts forth a criticism of 'digital ontology' as a step toward the articulation of an 'informational structural realism'. Based on the claims made in the chapter, the present paper seeks to evaluate the distinctly Kantian scope of the chapter from a rather unconventional viewpoint: while in sympathy with the author's doubts 'against' digital philosophy, we follow a different route. We turn our attention to the concept of construction as used in the book with the hope of raising some additional questions that might contribute to a better understanding of what is at stake in Floridi's experimental epistemological response to digital ontology.
BibTeX:
@article{sdrolia_rethinking_2014,
  author = {Sdrolia, Chryssa and Bishop, J. M.},
  title = {Rethinking Construction: On Luciano Floridi's 'Against Digital Ontology'},
  journal = {Minds and Machines: Journal for Artificial Intelligence, Philosophy and Cognitive Science},
  year = {2014},
  volume = {24},
  number = {1},
  pages = {89}
}
Uspensky, V.A. and Shen, A. Review: Ming Li, Paul Vitanyi, An introduction to Kolmogorov complexity and its applications 1995 J. Symbolic Logic
Vol. 60(3), pp. 1017-1020 
article URL 
BibTeX:
@article{uspensky_review:_1995,
  author = {Uspensky, V. A. and Shen, A.},
  title = {Review: Ming Li, Paul Vitanyi, An introduction to Kolmogorov complexity and its applications},
  journal = {J. Symbolic Logic},
  year = {1995},
  volume = {60},
  number = {3},
  pages = {1017--1020},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwY2AwNtIz0EUrE8xTLYBN1-TkZAtTc3PDZLO0FKNUM4tkgzQjwxQgZYQ6sg3vN0JHI1JLk3MyU8Dz-lBmVnGOPrBpbGwO7F0YgU4DNTQ2NAL31V0RCzwsQZf68KKYgVR3uAkwwFbtw9aMwCeSEVvdMddUE-kmQQZ-aJNSwRGSBoQYmFLzhBm4feHnsRYLM3AFwG4sqBRhiIHMB1gp-AKrLQWfTB0F0PJAhbBMYDuxEshzzFPIBK1fT4EcLKtQkq_gnZ-Tm5-eX5RfpgBeg55aAWy8KyTmpShklhQrIE-CizI4uLmGOHvoonggvgByqEU86Jhp51AfqCiUAnopHuElYzEG3kTQuvu8EvD-vBQJBgVgayjNNC0tycQ80dwkxcjY0jQpFVjnpSUaJVmkmaQaSzJYkm2fFAV6pRm4IJvPQYMlMgwsJUWlqbKQoxYBjYPUnw}
}
Elias, P. Review: Rudolf Carnap, Yehoshua Bar-Hillel, An Outline of a Theory of Semantic Information 1954 J. Symbolic Logic
Vol. 19(3), pp. 230-232 
article URL 
BibTeX:
@article{elias_review:_1954,
  author = {Elias, Peter},
  title = {Review: Rudolf Carnap, Yehoshua Bar-Hillel, An Outline of a Theory of Semantic Information},
  journal = {J. Symbolic Logic},
  year = {1954},
  volume = {19},
  number = {3},
  pages = {230--232},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwY2AwNtIz0EUrEwyNzM2Sk4Cpx9zUPBVY46YYmqQAOxapKaaG5hYm4CskkUa24f1G6GhEamlyTmYKeF4fyswqztEHNo2NzYEpyBRUAhsaGxqB–quiAUelqBLfXhRzECqO9wEGGCr9mFrRuATyYit7phrqol0kyADP7RJqeAISQNCDEypecIM3L7w81iLhRm4AmA3FlSKMERB5gOsFIJKU_Jz0hScE4H6C3QUIlMz8oszShMVnBKLdD1AewRzdBQc8xT8S0tAbVGF_DSFRAXIXn4QOxhoeh7QeAXojiZQDIsyOLi5hjh76KI4P74AcqRFPOiQaedQH6golAJ6KB7hIWMxBpa8_LxUCQaFNKOkFIskY8NkYNyZJBunJgFb6EkgRmJyWlpysokkgyXZ1khRoFeagQt0gB1khESGgaWkqDRVFnK-IgB8Yc6Y}
}
Wibral, M., Wollstadt, P., Meyer, U., Pampu, N., Priesemann, V. and Vicente, R. Revisiting Wiener's principle of causality - interaction-delay reconstruction using transfer entropy and multivariate analysis on delay-weighted graphs 2012
Vol. 2012, pp. 3676-3679 
inproceedings URL 
Abstract: To understand the function of networks we have to identify the structure of their interactions, but also interaction timing, as compromised timing of interactions may disrupt network function. We demonstrate how both questions can be addressed using a modified estimator of transfer entropy. Transfer entropy is an implementation of Wiener's principle of observational causality based on information theory, and detects arbitrary linear and non-linear interactions. Using a modified estimator that uses delayed states of the driving system and independently optimized delayed states of the receiving system, we show that transfer entropy values peak if the delay of the state of the driving system equals the true interaction delay. In addition, we show how reconstructed delays from a bivariate transfer entropy analysis of a network can be used to label spurious interactions arising from cascade effects and apply this approach to local field potential (LFP) and magnetoencephalography (MEG) data.
BibTeX:
@inproceedings{wibral_revisiting_2012,
  author = {Wibral, M. and Wollstadt, P. and Meyer, U. and Pampu, N. and Priesemann, V. and Vicente, R.},
  title = {Revisiting Wiener's principle of causality - interaction-delay reconstruction using transfer entropy and multivariate analysis on delay-weighted graphs},
  publisher = {IEEE},
  year = {2012},
  volume = {2012},
  pages = {3676--3679},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnZ1LaxRBEMcbEQ-K4CuaUQN9EU-z6df09FwNCXoQRBS9Df2UQNgsOxM1n8Sva1X3zO4mEQVvO9u9O8zQj39VV_2KECkWrL62JnRtso23KRjrOmdiYk77xL3G7buJ8apnm8xVFrHeSQ5Fiwv8mE_24Xo4tG7o7dlZxgra9VQ8Q0uY9RrZoFh9B8a2frf1tyC6TaLpUVZpDWZCYak2CElkX-ekL8V54f0hC2q6NtNxKGfd4fH7N0cYESYW0-2msiw7O9gdTCi5GHb2qZMHZD6znuNTNofW27T6m_Hb__X8D8neNluQftjsg4_Irbh8TO7tgA6fkF8fcxI7hljTL6dIun490NXs56fniXp7MWSTgNYUERbrknBRI8TykmazfYO6pRis_42OWXPHNUUH9fnqktploDlA8ruFaTVG-KIQVyj8Jv9R_SN7gmOgGdQ97JHPJ8efjt7WU0mI-hSFSB2a1HHf2NZbz5gHcRciS9Ip3loJk8K5IEwSotUJTEHdSMu72BkfQTg6r6N8Su5bTB1YjjnFMOwTamQQXqXWMWEVs86oJD2sTCJ5pYXnFanw5fergv_o2fSe-1VIFTm42pZFt1aGWxd0JyryrIyKTQ8hpdataCryqgyTbUs_iJ71BkxIkPHaGNWPP8e_9TMY1CTk1G__Wj-4D2jrVqk_NYEqZaBLbjYhL66VulHP__FoL8hdnAbFHfWS3IYxEA8KzPI3tXc02A}
}
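The transfer entropy estimated in the abstract above can be illustrated with a minimal plug-in estimator for binary time series with history length 1 (the paper's estimator additionally optimizes the delay of the source state; the names and data here are a toy example, not the authors' code).

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(X -> Y) in bits for discrete series,
    with history length 1: TE = I(y_{t+1}; x_t | y_t)."""
    n = len(y) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))        # (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))         # (y_{t+1}, y_t)
    singles = Counter(y[:-1])                      # y_t
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_x = c / pairs_yx[(y0, x0)]          # p(y_{t+1} | y_t, x_t)
        p_cond = pairs_yy[(y1, y0)] / singles[y0]  # p(y_{t+1} | y_t)
        te += p_joint * np.log2(p_cond_x / p_cond)
    return te

rng = np.random.default_rng(1)
x = rng.integers(0, 2, 50_000)    # driving system
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1]                    # y copies x with a one-step delay
z = rng.integers(0, 2, 50_000)    # independent control series

print(transfer_entropy(x, y))     # near 1 bit: x fully determines y's future
print(transfer_entropy(z, y))     # near 0: no information transfer
```

Sweeping the delay applied to the source series, as the paper does, would make TE peak at the true one-step interaction delay.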
Brillouin, L. Science and information theory 1962   book  
BibTeX:
@book{brillouin_science_1962,
  author = {Brillouin, Léon},
  title = {Science and information theory},
  publisher = {Academic Press},
  year = {1962}
}
Frieden, B.R. Science from Fisher Information 2004   book URL 
BibTeX:
@book{Frieden2004,
  author = {Frieden, B. R.},
  title = {Science from Fisher Information},
  publisher = {Cambridge University Press},
  year = {2004},
  edition = {2nd},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1LawIxEB5qi1LooVZNbSvdP7DLbhLzOK8VoT320Jska3IURP8_TrJZ6hZ7HAaSYcg8mW8CwGhR5n98QthhYLiwlC2dUJRb4UQpDWZ22mi_E_3O9rW6sQdA7zoYGHsrKeUABljkXVhldMNcMK5ZwqdiKoFWnQiFNY9epvU7HVP_0pFf9TbzxXCzfoTbAEEYw43bP8EwDmk2xwm8J0PMAiYkaz8tz9Lm06DfKSzWH9_1Jg8HblNjZmuT-IrO4MGEgfb9KQLfdgTuPL4-R0JEIHg5gdGP_lqpzWfdkuOOLI4RnVUcTgQDUHy8uSjkM2SKuqpR3pfKM-6YtaVxjJkGcwRhra_mMLsuzst_jFe4bwdXQgfiLcm4aNV0Bswcf-Q}
}
Singleton, D., Vagenas, E.C. and Zhu, T. Self-similarity, conservation of entropy/bits and the black hole information puzzle 2014 Journal of High Energy Physics
Vol. 2014(5), pp. 1-9 
article  
BibTeX:
@article{singleton_self-similarity_2014,
  author = {Singleton, Douglas and Vagenas, Elias C. and Zhu, Tao},
  title = {Self-similarity, conservation of entropy/bits and the black hole information puzzle},
  journal = {Journal of High Energy Physics},
  year = {2014},
  volume = {2014},
  number = {5},
  pages = {1--9}
}
Singleton, D., Vagenas, E.C. and Zhu, T. Self-similarity, conservation of entropy/bits and the black hole information puzzle 2015 It From Bit or Bit From It?, pp. 119-127  incollection  
BibTeX:
@incollection{singleton_self-similarity_2015,
  author = {Singleton, Douglas and Vagenas, Elias C and Zhu, Tao},
  title = {Self-similarity, conservation of entropy/bits and the black hole information puzzle},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {119--127}
}
Floridi, L. Semantic Conceptions of Information 2005 The Stanford Encyclopedia of Philosophy  electronic URL 
BibTeX:
@electronic{Floridi2005a,
  author = {Floridi, L.},
  title = {Semantic Conceptions of Information},
  journal = {The Stanford Encyclopedia of Philosophy},
  publisher = {Stanford University CSLI},
  year = {2005},
  url = {http://plato.stanford.edu/entries/information-semantic/}
}
Bar-Hillel, Y. and Carnap, R. Semantic Information 1953 The British Journal for the Philosophy of Science
Vol. 4(14), pp. 147-157 
article URL 
BibTeX:
@article{bar-hillel_semantic_1953,
  author = {Bar-Hillel, Yehoshua and Carnap, Rudolf},
  title = {Semantic Information},
  journal = {The British Journal for the Philosophy of Science},
  year = {1953},
  volume = {4},
  number = {14},
  pages = {147--157},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV09T8MwED2hTl2AtiDKl7IBQyCOY8eZEKqoYEN8rVFsnyUQlEJh4N9zjpPSoi5ImaIM1jm-e_Z7fgfA09Mk_pMTCoacW0wq4yxDWaVSZKiNSY3jTtbmnwsn29A2LvQiy1olWHP6BJf0C55JJQpVnE_fY987ynOsTSMNSsQsTYJL7miejqUsghWzP5NT3nDvD8G8VJEWRYl1iRlvQKsWbaUlc77590b8Cun1v4a-CesNAI0uwh_TgzWc9KF703Y0-O5Dr1nvs-i4MaU-GcDgDl9pEp5M1Fxg8hO6BQ_jy_vRVdx0VIgNY1zEllYf7U8o-qJgVjmGOfeIQjplhTbSJpxrm3GrdW5EolmmTaoqLQ3XuSNosw2dydsEdyBSHGWicuWsP5O0psqxImxE-MLqFFEM4XA5zGXlmZTylkomYdIsG8JRG_ZyGpw1ysCI81I_T2fl9SPtS-jJhzCoIzn_LIRxd_XrPeh6ajCI9Pah8_nxhQfBZPEHLHbEyQ}
}
Floridi, L. Semantic Information and the Correctness Theory of Truth 2011 Erkenntnis (1975-)
Vol. 74(2), pp. 147-175 
article URL 
Abstract: Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Sect. 2, semantic information is shown to be translatable into propositional semantic information (i). In Sect. 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in Sect. 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In Sects. 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science's sense of "verification" and "validation"); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science's sense of "proxy") and (5) proximal access to m commutes with the distal access to s (in the category theory's sense of "commutation"); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science's technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers not just systems users. In the course of the article all technical expressions from computer science are explained.
5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science's sense of "verification" and "validation"); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science's sense of "proxy") and (5) proximal access to m commutes with the distal access to s (in the category theory's sense of "commutation"); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science's technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers not just systems users. In the course of the article all technical expressions from computer science are explained.; Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Sect. 2, semantic information is shown to be translatable into propositional semantic information (i). In Sect. 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in Sect. 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. 
This completes the reduction of the truth of i to the correctness of A. In Sects. 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science’s sense of “verification” and “validation”); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science’s sense of “proxy”) and (5) proximal access to m commutes with the distal access to s (in the category theory’s sense of “commutation”); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science’s technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Section 8 draws a general conclusion about the nature of CTT as a theory for systems designers not just systems users. In the course of the article all technical expressions from computer science are explained.
BibTeX:
@article{floridi_semantic_2011,
  author = {Floridi, Luciano},
  title = {Semantic Information and the Correctness Theory of Truth},
  journal = {Erkenntnis (1975-)},
  year = {2011},
  volume = {74},
  number = {2},
  pages = {147--175},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3di9QwEB9EQQ5EPfV69QP6oC9i1yZt0-RJZPU40AfBE3wLSZugsHfu7XZB_3tnmrS3u9yBPpZOStKZTH6T-QIo-azI93RC6YwRvubey6L00kmraiUqW0nHpOFm92Z7usmgIMshSnDw6SNcsgv3tmJVI4Rk75aXOXWPIi9rbKWBqphKlIVW5pNCjnYXJe7kZALtHEFREYdgxF2YSXkhm_WWi_Taw2k4iE4ewBj5NgagTF7pq7z5awK0_3OBD-F-BKrZ-yBZh3DLXTyCgy9j54M_j0F-defIl59tFnOaiMeZuegyBJXZnNp-tD1p0iwUAMh–exstel_PIFvJx_P5qd5bMSQt2ihoI3JbClU11ISrWHMectkJzrc-o2rlfKI6Tg3dWFRV3bWOs4LYdCSs0Xlra_r8gjuGQrYpzkhhO4SuONxd7mETrwEf2ECd7-rzx_k6ad5eDwcH2frIftsdtknyM_hJ-Ri1hxDVhZWSlaixlGmYs4aqbgzTjHuBCoymcLrkc16GUp36KsizZSfpFEaNBmgGokTEgRN27pfmVbzSlIL5EalcDRwaPrGyB4cMgqL7hYLTTnFUuAk8M2rIDvTGK7XXBcaP1c3lFsoat3_7m-mUwhkEdsihgh0x3t0JQJTcnHj7JK9VzWF2uLgFN6Mcru1fFo1BUDoKD5h-cvO01z2yQkcxjFNhXMj2hRebu-BiXa4pxWyLNTgMEqB_QvZPBaipwIM_dOb_vQzOAhX-RT69xxu96uNexHqZv4F1VJQ_A}
}
Floridi, L. Semantic information and the network theory of account 2012 Synthese
Vol. 184(3), pp. 431-454 
article URL 
Abstract: The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient to be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, characterising the target semantic information t by default, is correctly satisfied by the information flow of correct answers provided by an informational source s. Section 4 illustrates some of the major advantages of such a Network Theory of Account (NTA) and clears the ground of a few potential difficulties. Section 5 clarifies why NTA and an informational analysis of knowledge, according to which knowledge is accounted semantic information, is not subject to Gettier-type counterexamples. A concluding section briefly summarises the results obtained.
BibTeX:
@article{floridi_semantic_2012,
  author = {Floridi, Luciano},
  title = {Semantic information and the network theory of account},
  journal = {Synthese},
  year = {2012},
  volume = {184},
  number = {3},
  pages = {431--454},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV07T8MwED6hTl2Alld4SJ4QIFIlThPHI0JUjIgCq-XYjoQoaVXK0H_PObEbaBlgjS-2chffw3f3GSChgyhc0wkGN52JJCszraM4L7mJdZKVecRUVpi6e-zbyTbQ1UlG9TbwCcpab7etbzGlttInCnlOMQxCLYymyhb1PY5fVnkEDAYcMCMPWZ4yn9f8bYYflmldP28kSmv7M9oB3xfj605Wyei2XX6zLvsf37UL2841JTfNv9SDLVP1ofvg7zpY9qHnNMEHuXBw1Zd7kI3NO4rnVREHwmpFTWSlCfqWpGrKzEndMLkk05LI5n6KfXge3T3d3ofuNoZQWZseappSw6VUqU6KCAXKTd2lijs4QacGA7eYsVJLnlGNvrCJU1ZKKhW6SIkaapUcQKeaVuYISBpRlUepMtxIdChUYSF3Csa5zlOJLmkAV14SYtaAbogWXtkySCCDhGWQGAZwaGUl7Dcu5lIJak9mLHAejnjxCT2Z4EBu8b3Q5Qng2nO_XaCe1xYHCMfoZoGZLgM43yBHQurewSBZJDXx8V8JT6Brnzfl36fQWcw_zVmDBPkFV67qYA}
}
Carnap, R. and Bar-Hillel, Y. Semantic Information 1953 The British Journal for the Philosophy of Science
Vol. 4, pp. 147-157 
article  
BibTeX:
@article{Carnap1953,
  author = {Carnap, R. and Bar-Hillel, Y.},
  title = {Semantic Information},
  journal = {The British Journal for the Philosophy of Science},
  year = {1953},
  volume = {4},
  pages = {147--157}
}
Hoffmeyer, J. Semiotic freedom: an emerging force 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 185-204  incollection  
BibTeX:
@incollection{hoffmeyer_semiotic_2010,
  author = {Hoffmeyer, Jesper},
  title = {Semiotic freedom: an emerging force},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {185--204},
  doi = {10.1017/CBO9780511778759.010}
}
Godfrey-Smith, P. Senders, receivers, and genetic information: comments on Bergstrom and Rosvall 2011 Biology & Philosophy
Vol. 26(2), pp. 177-181 
article  
BibTeX:
@article{godfrey-smith_senders_2011,
  author = {Godfrey-Smith, Peter},
  title = {Senders, receivers, and genetic information: comments on Bergstrom and Rosvall},
  journal = {Biology \& Philosophy},
  year = {2011},
  volume = {26},
  number = {2},
  pages = {177--181}
}
Fu, T., Abbasi, A., Zeng, D. and Chen, H. Sentimental Spidering: Leveraging Opinion Information in Focused Crawlers 2012 ACM Transactions on Information Systems (TOIS)
Vol. 30(4), pp. 1-30 
article  
Abstract: Despite the increased prevalence of sentiment-related information on the Web, there has been limited work on focused crawlers capable of effectively collecting not only topic-relevant but also sentiment-relevant content. In this article, we propose a novel focused crawler that incorporates topic and sentiment information as well as a graph-based tunneling mechanism for enhanced collection of opinion-rich Web content regarding a particular topic. The graph-based sentiment (GBS) crawler uses a text classifier that employs both topic and sentiment categorization modules to assess the relevance of candidate pages. This information is also used to label nodes in web graphs that are employed by the tunneling mechanism to improve collection recall. Experimental results on two test beds revealed that GBS was able to provide better precision and recall than seven comparison crawlers. Moreover, GBS was able to collect a large proportion of the relevant content after traversing far fewer pages than comparison methods. GBS outperformed comparison methods on various categories of Web pages in the test beds, including collection of blogs, Web forums, and social networking Web site content. Further analysis revealed that both the sentiment classification module and graph-based tunneling mechanism played an integral role in the overall effectiveness of the GBS crawler.;Despite the increased prevalence of sentiment-related information on the Web, there has been limited work on focused crawlers capable of effectively collecting not only topic-relevant but also sentiment-relevant content. In this article, we propose a novel focused crawler that incorporates topic and sentiment information as well as a graph-based tunneling mechanism for enhanced collection of opinion-rich Web content regarding a particular topic. 
The graph-based sentiment (GBS) crawler uses a text classifier that employs both topic and sentiment categorization modules to assess the relevance of candidate pages. This information is also used to label nodes in web graphs that are employed by the tunneling mechanism to improve collection recall. Experimental results on two test beds revealed that GBS was able to provide better precision and recall than seven comparison crawlers. Moreover, GBS was able to collect a large proportion of the relevant content after traversing far fewer pages than comparison methods. GBS outperformed comparison methods on various categories of Web pages in the test beds, including collection of blogs, Web forums, and social networking Web site content. Further analysis revealed that both the sentiment classification module and graph-based tunneling mechanism played an integral role in the overall effectiveness of the GBS crawler.;Despite the increased prevalence of sentiment-related information on the Web, there has been limited work on focused crawlers capable of effectively collecting not only topic-relevant but also sentiment-relevant content. In this article, we propose a novel focused crawler that incorporates topic and sentiment information as well as a graph-based tunneling mechanism for enhanced collection of opinion-rich Web content regarding a particular topic. The graph-based sentiment (GBS) crawler uses a text classifier that employs both topic and sentiment categorization modules to assess the relevance of candidate pages. This information is also used to label nodes in web graphs that are employed by the tunneling mechanism to improve collection recall. Experimental results on two test beds revealed that GBS was able to provide better precision and recall than seven comparison crawlers. Moreover, GBS was able to collect a large proportion of the relevant content after traversing far fewer pages than comparison methods. 
GBS outperformed comparison methods on various categories of Web pages in the test beds, including collection of blogs, Web forums, and social networking Web site content. Further analysis revealed that both the sentiment classification module and graph-based tunneling mechanism played an integral role in the overall effectiveness of the GBS crawler.;
BibTeX:
@article{fu_sentimental_2012,
  author = {Fu, Tianjun and Abbasi, Ahmed and Zeng, Daniel and Chen, Hsinchun},
  title = {Sentimental Spidering: Leveraging Opinion Information in Focused Crawlers},
  journal = {ACM Transactions on Information Systems (TOIS)},
  year = {2012},
  volume = {30},
  number = {4},
  pages = {1--30}
}
Anand, K., Bianconi, G. and Severini, S. Shannon and von Neumann entropy of random networks with heterogeneous expected degree 2011 Physical Review E
Vol. 83, pp. 036109 
article  
BibTeX:
@article{Anan2011,
  author = {Anand, Kartik and Bianconi, Ginestra and Severini, Simone},
  title = {Shannon and von Neumann entropy of random networks with heterogeneous expected degree},
  journal = {Physical Review E},
  year = {2011},
  volume = {83},
  pages = {036109}
}
Cole, C. Shannon revisited: Information in terms of uncertainty 1999 Journal of the American Society for Information Science
Vol. 44(4), pp. 204-211 
article  
Abstract: Shannon's theory of communication is discussed from the point of view of his concept of uncertainty. It is suggested that there are two information concepts in Shannon, two different uncertainties, and at least two different entropy concepts. Information science focuses on the uncertainty associated with the transmission of the signal rather than the uncertainty associated with the selection of a message from a set of possible messages. The author believes the latter information concept, which is from the sender's point of view, has more to say to information science about what information is than the former, which is from the receiver's point of view and is mainly concerned with “noise” reduction.
BibTeX:
@article{Cole1999,
  author = {Cole, C.},
  title = {Shannon revisited: Information in terms of uncertainty},
  journal = {Journal of the American Society for Information Science},
  year = {1999},
  volume = {44},
  number = {4},
  pages = {204--211}
}
Skyrms, B. Signals: evolution, learning, & information 2010   book URL 
BibTeX:
@book{skyrms_signals:_2010-1,
  author = {Skyrms, Brian},
  title = {Signals: evolution, learning, \& information},
  publisher = {Oxford University Press},
  year = {2010},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1NT4QwEJ2oG42JB1fd-rUJJ28gUKSt19XNJnLTg7dNGYrxshsF__92KKCsepw0LU1TZoY38x4APA5Cf8sncBs1EuR5kpioLJUyAiMkrSyMBcrcDJFt6PD9rpmjr_B-c9B_NztvV_8Vv9VIf76-X1drUpINiY4sSbON6g-kXGg_m3bl4H1uHDi3ObmKnACPmxH2RqwS0Qr39Mv9tO343UDTrwlU82MYGWIvjGHHrE6AZS0aWXk3XtYLKFensP_8_kbKyWcwnT–zBY-rbRssZxlzimh4angEzjS1AO_qhuuXMFgVNpphlEQYfapDA5eVfYgF08zZ447M6gaQlfwUTMbs5r77qeBOAeviFEpzAuNpUmMKnTKdYw2xUKpTan4BUz-3s7lfwNXcOjq7QRaXMNe_fllpu54Nnp5j5g}
}
Barwise, J. and Etchemendy, J. Situation Theory and Its Applications 1990, pp. 34-77  inbook  
BibTeX:
@inbook{Barwise1990,
  author = {Barwise, J. and Etchemendy, J.},
  title = {Situation Theory and Its Applications},
  publisher = {CLSI Publications/University of Chicago Press},
  year = {1990},
  pages = {34--77}
}
Floridi, L. and Taddeo, M. Solving the symbol grounding problem: a critical review of fifteen years of research 2005 Journal of Experimental & Theoretical Artificial Intelligence
Vol. 17(4), pp. 419-445 
article URL 
Abstract: This article reviews eight proposed strategies for solving the symbol grounding problem (SGP), which was given its classic formulation in Harnad (1990). After a concise introduction, the paper provides an analysis of the requirement that must be satisfied by any hypothesis seeking to solve the SGP, the zero semantical commitment condition. It is then used to assess the eight strategies, which are organized into three main approaches: representationalism, semi-representationalism and non-representationalism. The conclusion is that all the strategies are semantically committed and hence that none of them provides a valid solution to the SGP, which remains an open problem.
BibTeX:
@article{floridi_solving_2005,
  author = {Floridi, Luciano and Taddeo, Mariarosaria},
  title = {Solving the symbol grounding problem: a critical review of fifteen years of research},
  journal = {Journal of Experimental \& Theoretical Artificial Intelligence},
  year = {2005},
  volume = {17},
  number = {4},
  pages = {419--445},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1LT8MwDLbQTlwYTzEeUs5IhaZZl5QbQkzcKRK3Km1igTRWxIrE_j1O04wNBodd27pV68b259ifAURyGUc_bILVXCuuCYlxLTMSVCi5ETFihUa1xExLmW0IUxZdkaVD1OhpI1rL7Za6LmehPu6KYoREkSVOHXyg8KNl_hTSlffl43xhlcmzcc-7RxaAi6eww7nuDis-aoXBdMn3jPsQ9qFDzcliI_q7VX5NTfYm77QLO12Aym78H7UHW3a6D_0w_IF1tuAA8od64pIRjCJINpu_lvWEuRaRtkuGdXNqrplmVTdMgfkuGVYjwxdsCD-zOS2zmTvQUQ49H8Lj-C6_vY-6EQ1RxVWcRhQwSIMJptxkiA7LlNZUErmVksCvViVFdKqqpEhkaTKCQyodaWNtIkn_shRH0JvWU3sMTBtlJKHVlBDQkHyrtqmJRyUKMaQzmRzARVBK8eaZOAoeCE5_fq8BxMtqK5o2_YF-Vsnvy4vmsxnA8B-RP590spnYKWx7ElhXGHMGveb9w557KsgvULDvcg}
}
Montanaro, A. Some applications of hypercontractive inequalities in quantum information theory 2012 Journal of Mathematical Physics
Vol. 53(12), pp. 122206 
article  
Abstract: Hypercontractive inequalities have become important tools in theoretical computer science and have recently found applications in quantum computation. In this note we discuss how hypercontractive inequalities, in various settings, can be used to obtain (fairly) concise proofs of several results in quantum information theory: a recent lower bound of Lancien and Winter on the bias achievable by local measurements which are 4-designs; spectral concentration bounds for k-local Hamiltonians; and a recent result of Pellegrino and Seoane-Sepúlveda giving general lower bounds on the classical bias obtainable in multiplayer XOR games.
BibTeX:
@article{montanaro_applications_2012,
  author = {Montanaro, Ashley},
  title = {Some applications of hypercontractive inequalities in quantum information theory},
  journal = {Journal of Mathematical Physics},
  year = {2012},
  volume = {53},
  number = {12},
  pages = {122206}
}
Jafarkhani, H. Space-time coding: theory and practice 2005   book  
BibTeX:
@book{jafarkhani_space-time_2005,
  author = {Jafarkhani, Hamid},
  title = {Space-time coding: theory and practice},
  publisher = {Cambridge University Press},
  year = {2005}
}
Asselmeyer-Maluga, T. Spacetime Weave—Bit as the Connection Between Its or the Informational Content of Spacetime 2015 It From Bit or Bit From It?, pp. 129-142  incollection  
BibTeX:
@incollection{asselmeyer-maluga_spacetime_2015,
  author = {Asselmeyer-Maluga, Torsten},
  title = {Spacetime Weave—Bit as the Connection Between Its or the Informational Content of Spacetime},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {129--142}
}
Ainsworth, P. Structural Realism 2010 Sciverse Topics  electronic URL 
BibTeX:
@electronic{Ainsworth2010,
  author = {Ainsworth, P.},
  title = {Structural Realism},
  journal = {Sciverse Topics},
  year = {2010},
  url = {http://www.scitopics.com/StructuralRealism.html}
}
Bueno, O. Structuralism and Information 2010 Metaphilosophy
Vol. 41(3), pp. 365-379 
article  
Abstract: According to Luciano Floridi (2008), informational structural realism provides a framework to reconcile the two main versions of realism about structure: the epistemic formulation (according to which all we can know is structure) and the ontic version (according to which structure is all there is). The reconciliation is achieved by introducing suitable levels of abstraction and by articulating a conception of structural objects in information-theoretic terms. In this essay, I argue that the proposed reconciliation works at the expense of realism. I then propose an alternative framework, in terms of partial structures, that offers a way of combining information and structure in a realist setting while still preserving the distinctive features of the two formulations of structural realism. Suitably interpreted, the proposed framework also makes room for an empiricist form of informational structuralism (structural empiricism). Pluralism then emerges.
BibTeX:
@article{bueno_structuralism_2010,
  author = {Bueno, Otávio},
  title = {Structuralism and Information},
  journal = {Metaphilosophy},
  year = {2010},
  volume = {41},
  number = {3},
  pages = {365--379}
}
Takahashi, M. Studies in relativity and quantum information theory 2013   phdthesis  
Abstract: This thesis explores a broad range of topics in the foundations of relativity and quantum information theory. The first and main topic of this thesis is on relativistic quantum information theory. Here we construct a reformulation of quantum information, which is consistent with relativity theory. We will see that by providing a rigorous formulation starting with the field equations for a massive fermion and a photon we can construct a theory for relativistic quantum information. In particular we provide a measurement formalism, a transport equation which describes the unitary evolution of a state through spacetime as well as how to extend this to multipartite systems. The second topic concerns the nature of time, duration and clocks in current physical theories and in particular for Newtonian mechanics. We analyse the relationship between the readings of clocks in Newtonian mechanics with absolute time. We will see that in order to answer this question we must provide not only a model for a clock but also solve what is referred to as Newton’s Scholium problem. We then compare this with other dynamical theories in particular quantum mechanics and general relativity where the treatment of time is quite different from Newtonian mechanics. The final topic is rather different from the first two. In this chapter we investigate a range of methods to perform tomography in a solid-state qubit device, for which a priori initialization and measurement of the qubit is restricted to a single basis of the Bloch sphere. We explore and compare several methods to acquire precise descriptions of additional states and measurements, quantifying both stochastic and systematic errors, ultimately leading to a tomographically-complete set that can be subsequently used in process tomography. We focus in detail on the example of a spin qubit formed by the singlet-triplet subspace of two electron spins in a GaAs double quantum dot, although our approach is quite general.
BibTeX:
@phdthesis{takahashi_studies_2013,
  author = {Takahashi, Maki},
  title = {Studies in relativity and quantum information theory},
  year = {2013}
}
Dretske, F. Supervenience and the causal explanation of behavior 2015   incollection  
BibTeX:
@incollection{dretske_supervenience_2015,
  author = {Dretske, Fred},
  title = {Supervenience and the causal explanation of behavior},
  year = {2015}
}
Neander, K. Teleological Theories of Mental Content 2012 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{neander_teleological_2012,
  author = {Neander, Karen},
  title = {Teleological Theories of Mental Content},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2012},
  edition = {Spring 2012},
  url = {https://plato.stanford.edu/entries/content-teleological/#3.1}
}
Nanay, B. Teleosemantics without Etiology 2014 Philosophy of Science
Vol. 81(5), pp. 798-810 
article  
Abstract: The aim of teleosemantics is to give a scientifically respectable or 'naturalistic' theory of mental content. This paper focuses on one of the key concepts of teleosemantics: biological function. It has been universally accepted in the teleosemantics literature that the account of biological function one should use to flesh out teleosemantics is that of etiological function. My claim is that if we replace this concept of function with an alternative one and if we also restrict the scope of teleosemantics, we can arrive at an account of biologizing mental content that is much less problematic than the previous attempts.
BibTeX:
@article{nanay_teleosemantics_2014,
  author = {Nanay, Bence},
  title = {Teleosemantics without Etiology},
  journal = {Philosophy of Science},
  year = {2014},
  volume = {81},
  number = {5},
  pages = {798--810}
}
Greco, J. Testimonial Knowledge and the Flow of Information 2015   incollection  
Abstract: This chapter reviews a number of related problems in the epistemology of testimony, and suggests some dilemmas for any theory of knowledge that tries to solve them. Here a common theme emerges: It can seem that any theory must make testimonial knowledge either too hard or too easy, and that therefore no adequate account of testimonial knowledge is possible. The chapter then puts forward a proposal for making progress. Specifically, an important function of the concept of knowledge is to govern the acquisition and distribution of quality information within an epistemic community. Testimonial exchanges paradigmatically serve in the distribution role, but sometimes serve in the acquisition role. The resulting position, it is argued, explains why testimonial knowledge is sometimes easy to get, and sometimes much harder.
BibTeX:
@incollection{greco_testimonial_2015,
  author = {Greco, John},
  title = {Testimonial Knowledge and the Flow of Information},
  publisher = {Oxford University Press},
  year = {2015}
}
Dade-Robertson, M. The architecture of information: architecture, interaction design and the patterning of digital information 2011   book  
BibTeX:
@book{dade-robertson_architecture_2011,
  author = {Dade-Robertson, Martyn},
  title = {The architecture of information: architecture, interaction design and the patterning of digital information},
  publisher = {Routledge},
  year = {2011}
}
Shannon, C. The bandwagon (Edtl.) 1956 IRE Transactions on Information Theory
Vol. 2(1), pp. 3-3 
article  
BibTeX:
@article{shannon_bandwagon_1956,
  author = {Shannon, C.},
  title = {The bandwagon (Edtl.)},
  journal = {IRE Transactions on Information Theory},
  year = {1956},
  volume = {2},
  number = {1},
  pages = {3--3}
}
Abel, D.L. The biosemiosis of prescriptive information 2009 Semiotica
Vol. 2009(174), pp. 1-19 
article  
Abstract: Exactly how do the sign/symbol/token systems of endo- and exo-biosemiosis differ from those of cognitive semiosis? Do the biological messages that integrate metabolism have conceptual meaning? Semantic information has two subsets: Descriptive and Prescriptive. Prescriptive information instructs or directly produces nontrivial function. In cognitive semiosis, prescriptive information requires anticipation and "choice with intent" at bona fide decision nodes. Prescriptive information either tells us what choices to make, or it is a recordation of wise choices already made. Symbol systems allow recordation of deliberate choices and the transmission of linear digital prescriptive information. Formal symbol selection can be instantiated into physicality using physical symbol vehicles (tokens). Material symbol systems (MSS) formally assign representational meaning to physical objects. Even verbal semiosis instantiates meaning into physical sound waves using an MSS. Formal function can also be incorporated into physicality through the use of dynamically-inert (dynamically-incoherent or -decoupled) configurable switch-settings in conceptual circuits. This article examines the degree to which biosemiosis conforms to the essential formal criteria of prescriptive semiosis and cybernetic management.
BibTeX:
@article{abel_biosemiosis_2009,
  author = {Abel, David L.},
  title = {The biosemiosis of prescriptive information},
  journal = {Semiotica},
  year = {2009},
  volume = {2009},
  number = {174},
  pages = {1--19}
}
Müller, M.P., Oppenheim, J. and Dahlsten, O.C.O. The black hole information problem beyond quantum theory 2012 Journal of High Energy Physics
Vol. 2012(9), pp. 1-32 
article  
Abstract: The origin of black hole entropy and the black hole information problem provide important clues for trying to piece together a quantum theory of gravity. Thus far, discussions on this topic have mostly assumed that in a consistent theory of gravity and quantum mechanics, quantum theory will be unmodified. Here, we examine the black hole information problem in the context of generalisations of quantum theory. In particular, we examine black holes in the setting of generalised probabilistic theories, in which quantum theory and classical probability theory are special cases. We compute the time it takes information to escape a black hole, assuming that information is preserved. We find that under some very general assumptions, the arguments of Page (that information should escape the black hole after half the Hawking photons have been emitted), and the black-hole mirror result of Hayden and Preskill (that information can escape quickly) need to be modified. The modification is determined entirely by what we call the Wootters-Hardy parameter associated with a theory. We find that although the information leaves the black hole after enough photons have been emitted, it is fairly generic that it fails to appear outside the black hole at this point — something impossible in quantum theory due to the no-hiding theorem. The information is neither inside the black hole, nor outside it, but is delocalised. Our central technical result is an information decoupling theorem which holds in the generalised probabilistic framework.
BibTeX:
@article{muller_black_2012,
  author = {Müller, Markus P. and Oppenheim, Jonathan and Dahlsten, Oscar C. O.},
  title = {The black hole information problem beyond quantum theory},
  journal = {Journal of High Energy Physics},
  year = {2012},
  volume = {2012},
  number = {9},
  pages = {1--32}
}
Floridi, L. The Blackwell guide to the philosophy of computing and information 2004   book URL 
BibTeX:
@book{floridi_blackwell_2004,
  author = {Floridi, Luciano},
  title = {The Blackwell guide to the philosophy of computing and information},
  publisher = {Blackwell Pub},
  year = {2004},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV09T8MwED1RkFAlJGih5qvS_YFEiZ04zUqhqgQDAwNbZdcxYklBhIF_z9lxqhSV8eTIusRnv9yz7xlA8DiJ_qwJ2Zrr3FA46TIx0hJoWMo7Sm1ylct296DHbO_LG3cK0DsGQ3j9qGIAA0ryerPSL8OiIKzzmZgUKeclBW_PKNMgv9M1yl3b7UT3lPk83CzO4NCVIIzgoKrHcNpdvIBhHo5h-NxdQPBzDnc00rjl4fDt-91U2GyQ_uzwY_scbiyufUeEVahqg0Ey1Q3MBUwXDy_zZeQ8WQVGZ6XDe3M-gRPlTsLXja-YMwyOLIVtxRyUMPKawfFr-XQ_Wz7OW3PUmfGXL-uKPxtGyOWjPpJxcQko1SzRlSlMYTPK3oSWViU6tZly4i_CXMFkvzvX_zXcwLA98eKoi9vg47T9vr-XIZRh}
}
Floridi, L. The Blackwell Guide to the Philosophy of Computing and Information 2004 , pp. 40-61  inbook  
BibTeX:
@inbook{Floridi2004,
  author = {Floridi, L.},
  title = {The Blackwell Guide to the Philosophy of Computing and Information},
  publisher = {Blackwell},
  year = {2004},
  pages = {40--61}
}
Brádler, K. and Adami, C. The capacity of black holes to transmit quantum information 2014
Vol. 2014(5), pp. 1-26 
article  
BibTeX:
@article{bradler_capacity_2014,
  author = {Brádler, Kamil and Adami, Christoph},
  title = {The capacity of black holes to transmit quantum information},
  year = {2014},
  volume = {2014},
  number = {5},
  pages = {1--26}
}
Wakakuwa, E. and Murao, M. The chain rule implies Tsirelson's bound: an approach from generalized mutual information 2012 New Journal of Physics
Vol. 14(11), pp. 113037 
article URL 
Abstract: In order to analyze an information theoretical derivation of Tsirelson's bound based on information causality, we introduce a generalized mutual information (GMI), defined as the optimal coding rate of a channel with classical inputs and general probabilistic outputs. In the case where the outputs are quantum, the GMI coincides with the quantum mutual information. In general, the GMI does not necessarily satisfy the chain rule. We prove that Tsirelson's bound can be derived by imposing the chain rule on the GMI. We formulate a principle, which we call the no-supersignaling condition , which states that the assistance of nonlocal correlations does not increase the capability of classical communication. We prove that this condition is equivalent to the no-signaling condition. As a result, we show that Tsirelson's bound is implied by the nonpositivity of the quantitative difference between information causality and no-supersignaling.
BibTeX:
@article{wakakuwa_chain_2012,
  author = {Wakakuwa, Eyuri and Murao, Mio},
  title = {The chain rule implies Tsirelson's bound: an approach from generalized mutual information},
  journal = {New Journal of Physics},
  year = {2012},
  volume = {14},
  number = {11},
  pages = {113037},
  url = {http://stacks.iop.org/1367-2630/14/i=11/a=113037}
}
Wang, Y. The cognitive informatics theory and mathematical models of visual information processing in the brain 2009 International Journal of Cognitive Informatics and Natural Intelligence
Vol. 3(3), pp. 1-11 
article  
BibTeX:
@article{wang_cognitive_2009,
  author = {Wang, Yingxu},
  title = {The cognitive informatics theory and mathematical models of visual information processing in the brain},
  journal = {International Journal of Cognitive Informatics and Natural Intelligence},
  year = {2009},
  volume = {3},
  number = {3},
  pages = {1--11}
}
Szudzik, M.P. The Computable Universe Hypothesis 2010   article  
Abstract: When can a model of a physical system be regarded as computable? We provide the definition of a computable physical model to answer this question. The connection between our definition and Kreisel's notion of a mechanistic theory is discussed, and several examples of computable physical models are given, including models which feature discrete motion, a model which features non-discrete continuous motion, and probabilistic models such as radioactive decay. We show how computable physical models on effective topological spaces can be formulated using the theory of type-two effectivity (TTE). Various common operations on computable physical models are described, such as the operation of coarse-graining and the formation of statistical ensembles. The definition of a computable physical model also allows for a precise formalization of the computable universe hypothesis–the claim that all the laws of physics are computable.
BibTeX:
@article{szudzik_computable_2010,
  author = {Szudzik, Matthew P.},
  title = {The Computable Universe Hypothesis},
  year = {2010}
}
Lloyd, S. The computational universe 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 92-103  incollection  
BibTeX:
@incollection{lloyd_computational_2010,
  author = {Lloyd, Seth},
  title = {The computational universe},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {92--103},
  note = {DOI: 10.1017/CBO9780511778759.005}
}
Smith, J.M. The Concept of Information in Biology 2000 Philosophy of Science
Vol. 67(2), pp. 177-194 
article URL 
Abstract: The use of informational terms is widespread in molecular and developmental biology. The usage dates back to Weismann. In both protein synthesis and in later development, genes are symbols, in that there is no necessary connection between their form (sequence) and their effects. The sequence of a gene has been determined, by past natural selection, because of the effects it produces. In biology, the use of informational terms implies intentionality, in that both the form of the signal, and the response to it, have evolved by selection. Where an engineer sees design, a biologist sees natural selection.
BibTeX:
@article{smith_concept_2000,
  author = {Smith, John M.},
  title = {The Concept of Information in Biology},
  journal = {Philosophy of Science},
  year = {2000},
  volume = {67},
  number = {2},
  pages = {177--194},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV09T8MwED2hTl2AfkGACi9IMARiO06cCaGKipEBZstJbAkpSktbhv577NhpG9QFKVOiKInPd37x3b0HQMljFP6JCSyhSueFNAeJUsVZQZnWmcyZpolksruzDa1woS2ybKoEm5y-gUt5pZ6w8QucPi-_Q6sdZXOsXkjDBGJMosSRvO5SCZhHLhxTHHJidX4OFiAfhl0pYgdkdrVIDtac-Rm05aNtrckuAb1vkT9Si_2vbzmHU49I0YubQgM4UfUQ-u-txMF2CAMfANbo3rNUP4zgzkwwNHNNj2ihke9rsnZGXzVyGpfbMXzOXz9mb6HXXAgLYvwzTElhG_OkwUXGPlzqMuMqjjNd4oLkkU61jrVk5hxXJFIRUXmZZhxjEyZzyRM6gV69qNUlICZLqqjEtKAyVgnhBsqkOiOMKhzjEgdw2xpCLB21hmhS4jwRBrWZ_6AAJtY-wvraZiULkZLY8vnTAEbNUO7va8YxgIvWgKKsKmGxkuWbwXEA486VxCAdy_QTwLRjZ-E9eO3f4Or4g66h71ry7W7MDfQ2qx81dUSOvxaj5Bs}
}
Smith, J.M. The concept of information in biology 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 123-145  incollection  
BibTeX:
@incollection{smith_concept_2010,
  author = {Smith, John Maynard},
  title = {The concept of information in biology},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {123--145},
  note = {DOI: 10.1017/CBO9780511778759.007}
}
Fredkin, E. The Digital Perspective 2003 International Journal of Theoretical Physics
Vol. 42(2), pp. 145-145 
article URL 
BibTeX:
@article{fredkin_digital_2003,
  author = {Fredkin, Ed},
  title = {The Digital Perspective},
  journal = {International Journal of Theoretical Physics},
  year = {2003},
  volume = {42},
  number = {2},
  pages = {145--145},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1LT8MwDLbQJCQuvB_jIfXGqVPSpEt7RMDEEcHEtWriBCGkMjH4_9h9ju2ye1w1bfLZjr98BlDJRMRrmGBSJz26TFM6kkmvfObyVCHBJgUkTvn_J9sg-pOM6nPSFShr3F6V5km4Fqm4cQNjMDkqpvS9vL4NortCN0hMKRKFOdmatM-K_UYdtHYvswPoOMYdraSvNQ-34Tdp11u_9iHst3FndNcslCPY8dUx7Nb8T7c8gXNaLtHDxzu3EImehwuYpzCfPc7vn-K2Z0K8pA8bpy7I0opAMIbTEo3AkFupraE_xbovmHuZonYCpbAWMbe8YwNymJGTp1ZnMKq-Kn8BkUIbrEu8tezBpqZ0PuT0FBMM9-uxY-inWCwaZYyCckmjuLOlLtr5FKzPZkyxwDCG2w0Dpsa1Vjopknb05dYjr2CvZtfVNOprGP18__qbRlHxD7TmsYw}
}
French, S. and Ladyman, J. The Dissolution of Objects: Between Platonism and Phenomenalism. 2003 Synthese
Vol. 136, pp. 73-77 
article  
BibTeX:
@article{French2003,
  author = {French, S. and Ladyman, J.},
  title = {The Dissolution of Objects: Between Platonism and Phenomenalism.},
  journal = {Synthese},
  year = {2003},
  volume = {136},
  pages = {73--77}
}
Woodward, P. The emergence of mental content: An essay in the metaphysics of mind 2015   phdthesis  
Abstract: Intentionality is the aboutness or directedness of mental states. According to the most popular theories of intentionality, a mental state’s intentional content is constituted by its causal embeddedness in an organism, vis-à-vis that organism’s environment. I argue that theories of this sort fail to explain how we could know the intentional contents of our mental states. As an alternative to causation-based theories of intentionality, I develop a consciousness-based theory of intentionality, as follows. Phenomenal properties are experiential aspects of consciousness. Among the various types of phenomenal property (sensory, somatic, conative, and so on) are phenomenal-intentional properties , or P-I properties. P-I properties are experiential aspects of consciousness whose natures consist in the presentation to the subject of an intentional content. In perception, imagination and cognition, P-I properties bind together to form modes of presentation of all of the intentional contents we can entertain. Along with the rest of the phenomenal domain, P-I properties emerge from the physical systems on which they depend, but are not reducible to, constituted by or realized in the states of those systems.
BibTeX:
@phdthesis{woodward_emergence_2015,
  author = {Woodward, Philip},
  title = {The emergence of mental content: An essay in the metaphysics of mind},
  year = {2015}
}
D'Agostino, M. and Floridi, L. The Enduring Scandal of Deduction: Is Propositional Logic Really Uninformative? 2009
Vol. 167(2), pp. 271-315 
article URL 
Abstract: Deductive inference is usually regarded as being “tautological” or “analytical”: the information conveyed by the conclusion is contained in the information conveyed by the premises. This idea, however, clashes with the undecidability of first-order logic and with the (likely) intractability of Boolean logic. In this article, we address the problem both from the semantic and the proof-theoretical point of view. We propose a hierarchy of propositional logics that are all tractable (i.e. decidable in polynomial time), although by means of growing computational resources, and converge towards classical propositional logic. The underlying claim is that this hierarchy can be used to represent increasing levels of “depth” or “informativeness” of Boolean reasoning. Special attention is paid to the most basic logic in this hierarchy, the pure “intelim logic”, which satisfies all the requirements of a natural deduction system (allowing both introduction and elimination rules for each logical operator) while admitting of a feasible (quadratic) decision procedure. We argue that this logic is “analytic” in a particularly strict sense, in that it rules out any use of “virtual information”, which is chiefly responsible for the combinatorial explosion of standard classical systems. As a result, analyticity and tractability are reconciled and growing degrees of computational complexity are associated with the depth at which the use of virtual information is allowed.
BibTeX:
@article{dagostino_enduring_2009,
  author = {D'Agostino, Marcello and Floridi, Luciano},
  title = {The Enduring Scandal of Deduction: Is Propositional Logic Really Uninformative?},
  year = {2009},
  volume = {167},
  number = {2},
  pages = {271--315},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9QwEB4hkFAlBGyhITwkX0CAyDZ2snHCpUJ9qAgkKh4SNyvx4xQt2yZF9N8zYzu7tIUD2lO0duQ4X8Zjz8z3ARRinmdXbILgpsGFwglbOJkbUeGq2ErXGdKV1p5X-4-T7fVJBiVZ-ixBH9NHd6nr7S7ueCRHx2RvdZqRehRFWaOUBppioigLUubrYALuCCI7Y5PJeiGn4KavoONCNBnlADQl6a1dWp6ikQ6JipddUKoZOR-uhU_9qnR0D6Y0uCkbZR2i3hTR_yVb-z-f9j7cjV4rexdgNoMbdrkNWyeTDMLFNsyikRjYy8hk_eoBfEIQssNlKIVkXzSdWvTsh2MHRBhLkHjL3g_shJQaQu4Y_k3qz5p9Rge2v2DoEi-jW_3T7j2Eb0eHX_ePs6jgkGkueZPVmspPdFd1zi7wZ0RZm85w4sHDrVvliB3e6QKb5J0py86JLufaLLQsq1LzYgfutJTpvxx9RaBJ4JbDz9ImtFQmON0J3P7efDyojz_sh8vZdDkffNna_HRMEAh-wrJqLh8BI9wVNkfTimtzXZtGa120eeOsXbTS2hReTxhQq8D5oTbszgQYRdqdBBhVppAQShTNxnjWakWZ9YVEtzOFHf821_eYXiV2mZCkTN8rxCfxQKF5TOF5ANa6j1CDULmqiEaI0965VOOvEe9wpR1Fo6mMOIU3EyI3Y_dD9nqjESdh7CvjUnhxrTnFy2IfXkkcATV-_K-HeQJbIcBGaXlP4eZ4dm6fBU7L38XJLmk}
}
Gray, R.M. The Entropy Ergodic Theorem 2011 Entropy and Information Theory, pp. 97-115  incollection  
BibTeX:
@incollection{gray_entropy_2011-2,
  author = {Gray, Robert M.},
  title = {The Entropy Ergodic Theorem},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {97--115}
}
Lerner, V.S. The entropy functional, the information path functional's essentials and their connections to Kolmogorov's entropy, complexity and physics 2011   article  
Abstract: The paper introduces the recent results related to an entropy functional on trajectories of a controlled diffusion process, and the information path functional (IPF), analyzing their connections to the Kolmogorov's entropy, complexity and the Lyapunov's characteristics. Considering the IPF's essentials and specifics, the paper studies the singularities of the IPF extremal equations and the created invariant relations, which both are useful for the solution of important mathematical and applied problems. Keywords: Additive functional; Entropy; Singularities; Natural Border Problem; Invariant
BibTeX:
@article{lerner_entropy_2011,
  author = {Lerner, Vladimir S.},
  title = {The entropy functional, the information path functional's essentials and their connections to Kolmogorov's entropy, complexity and physics},
  year = {2011}
}
Floridi, L. The ethics of information 2014   book URL 
BibTeX:
@book{floridi_ethics_2014,
  author = {Floridi, Luciano},
  title = {The ethics of information},
  publisher = {Oxford University Press},
  year = {2014},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV07T8MwED4BlVAlBgo0FKiUP5Dg2G7tYy1UlWBgYGCr4toZG0HDwL_nLg9QCownS7Zln_2d7_EZQMlUJHt3gjBovXBocKaDEqHYEOwE62zGdCPo-p5t6Pz7XTLHd4T3pwb9N2vifvQf1W2-4Z-v78pdyUyygjNu6dKWGccfmLmQnk2HtneeSQ0JUC09krAh4MnMnGOFLVdPJ9sebV-NRctTGAQuUBjBQdiewfC5-4vg8xwmtOlxqDPY47KIW1JUXvoLmC4fXharhLtbtz6btdNoDZL1pMZwknOu-7aqa-J8BIOCFDNEDBYRDR3B8Ss-3dvV46IRR52Y7urCrfStigibar1O5qm5hBhnISu0k6bQuXYmw-ClV1Jpsg6N9HIC47-nc_VfwzUMyZTQjXPiBo6q948wbdboC4evi38}
}
Colyvan, M. and Lyon, A. The Explanatory Power of Phase Spaces 2008 Philosophia Mathematica
Vol. 16(2), pp. 227-243 
article  
BibTeX:
@article{Colyvan2008a,
  author = {Colyvan, M. and Lyon, A.},
  title = {The Explanatory Power of Phase Spaces},
  journal = {Philosophia Mathematica},
  year = {2008},
  volume = {16},
  number = {2},
  pages = {227--243}
}
Donaldson-Matasci, M.C., Bergstrom, C.T. and Lachmann, M. The fitness value of information 2010 Oikos
Vol. 119(2), pp. 219-230 
article  
BibTeX:
@article{donaldson-matasci_fitness_2010,
  author = {Donaldson-Matasci, Matina C. and Bergstrom, Carl T. and Lachmann, Michael},
  title = {The fitness value of information},
  journal = {Oikos},
  year = {2010},
  volume = {119},
  number = {2},
  pages = {219--230}
}
Bennett, C.H. and Landauer, R. The Fundamental Physical Limits of Computation 1985 Scientific American
Vol. 253(1), pp. 48 
article URL 
Abstract: Examines what constraints govern the physical process of computation, considering such areas as whether a minimum amount of energy is required per logic step. Indicates that although there seems to be no minimum, answers to other questions are unresolved. Examples used include DNA/RNA, a Brownian clockwork turning machine, and others. (JN)
BibTeX:
@article{bennett_fundamental_1985,
  author = {Bennett, Charles H. and Landauer, Rolf},
  title = {The Fundamental Physical Limits of Computation},
  journal = {Scientific American},
  year = {1985},
  volume = {253},
  number = {1},
  pages = {48},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwY2AwNtIz0EUrEwxMgeko0SjVMM3c0jLVEthIMEgyNk9NMTdNNEuzBE_wIo1sk7XI0gt05Y6FKTMDMzAZQvpB4P3mEGHocmakusNNkIEf2uhTcITEkhADU2qeCIMeMIYU3EAbMSDn6ysEQINMAbzpqFghP00BcucCOPBEGWTcXEOcPXRBFsQXQA6JiIdZbCTGwJsIWqyeVwLe1JYizsCaBkxZqeKg0l4c6BZxBo4ISx8XCw9vZwhXCMbVKwbvvNIrLBEHVi7ghKlrpmcuwaCQkgRsN5iZpaQCa3sTiyTjRMtEU7O0NCPzZGAPBViISTKIYnWNFA5xaQYuQ0sLU8jYggzUhbKQowkBWrF91g}
}
Chaitin, G. The Halting Probability Omega: Irreducible Complexity in Pure Mathematics 2007 Milan Journal of Mathematics
Vol. 75, pp. 291–304 
article  
BibTeX:
@article{Chaitin2007,
  author = {Chaitin, G.},
  title = {The Halting Probability Omega: Irreducible Complexity in Pure Mathematics},
  journal = {Milan Journal of Mathematics},
  year = {2007},
  volume = {75},
  pages = {291--304}
}
Smith, J.M. The Idea of Information in Biology 1999 The Quarterly Review of Biology
Vol. 74(4), pp. 395-400 
article URL 
Abstract: The idea of information in biology is discussed. There has been, in the course of evolution, a series of changes in the way in which information is stored and transmitted. However, philosophers of biology have either ignored the concept of information or have argued that it is irrelevant or misleading. The manner in which the concept has been applied in genetics, evolution, and developmental biology is considered.
BibTeX:
@article{smith_idea_1999,
  author = {Smith, John M.},
  title = {The Idea of Information in Biology},
  journal = {The Quarterly Review of Biology},
  year = {1999},
  volume = {74},
  number = {4},
  pages = {395--400},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5kQfDicxfrA4P3ats0bXISERe96znkCcLSXdf14L930qbr1sfBU2loaZtJJl8638wHQIurLP3mE5Syha5crerSGsVVRg0Ce2UV5x7N5IZ_tqEXdQwky5Yl2Mb0ES7pmbvGJSV41JvFaxrEo0KQNSppoCdGvEw7Yt9m6d1YnJGmrA6qxBsrUPTDHRdxgDKHYiQbi850D3peQE82WUegv3LkfyFj_-9j9mE3YlJy2w2iA9hyzSFsdyqVH0dwiUOJPFqnyNyTmL4UzEleGhIvGsPz9P7p7iGN0gqpKUJCjs09FdprljvDNR4y7WvrmbDc80wo5ag3rMBTZzy3jBrHuKOuspXiDOf0BEbNvHHHQEQAFdR4K9CQKqu4UzUvLcI0y4XOfQIXfXfLRVdBQ7aRb15JKso8EwlMghVkmFKrpTISIR3uW7M8gXHbYev7Ym9he28naWczWdZhW1b-bBch7zZnCZwPjCnjPH2LL3Dyx3NOYaet0tCyV85gtFq-u_OuXuMnre3bYA}
}
Denning, P.J. and Bell, T. The Information Paradox 2012 American Scientist
Vol. 100(6), pp. 470 
article URL 
Abstract: [...]the concept of information seems fuzzy and abstract to many people, making it hard for them to understand how information systems really work. [...]all the components are physical, and information is always encoded into some sort of signal, which can be transmitted and translated without losing the information it encodes (see Figure 3). Because information is always represented by physical means, it takes time and energy to read, write and transform it. Bayesian inference programs are extensively used in data mining - they can infer complex hypotheses using the evidence in very large data sets.
BibTeX:
@article{denning_information_2012,
  author = {Denning, Peter J. and Bell, Tim},
  title = {The Information Paradox},
  journal = {American Scientist},
  year = {2012},
  volume = {100},
  number = {6},
  pages = {470},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV09T8MwED0hJiQEtEApUCkjDIHEdp3chABRMTJ0YKv8cZ6qtrRB4udjJ04qtWJhdobIH-_e3b27A-DsIUt3MIF5Xq-0Q8ecdyi8VbHCUE1WXaZysRPZbor6Q6SgOe0WJGvktksTguaPuTe8RWgHz55WX2kYIxXSrXGmhsdkj81B4ld8vnRZhZAy6iboIco9_K2NyuQU2sxQKybpMszbGvh9sfW_f_YMTiL7TJ6b69KDA1r0w-DmKPLoQy–9U1yFxtS35_DwN-lJNYthc-SD7VWdvlzAdPJ2_T1PY0DFdKVt9Ipo1w7Kwv_ykurMiV16ErMPYewiCxDhxINocuFNsjFmLSkTAnKZcHJSeKXcKyC7n5R1fV59goSx4vC2eDIEBdkEIWjzBhJVpflWOIQRu2GzJQO0RtTbWbb7RjCoFu38_msbsfvvRyeX_-5cgNHnrawpiLwFg6r9TeNmjaKv1uJuVg}
}
Mathur, S.D. The information paradox: a pedagogical introduction 2009 Classical and Quantum Gravity
Vol. 26(22), pp. 224001 
article URL 
Abstract: The black hole information paradox is a very poorly understood problem. It is often believed that Hawking's argument is not precisely formulated, and a more careful accounting of naturally occurring quantum corrections will allow the radiation process to become unitary. We show that such is not the case, by proving that small corrections to the leading order Hawking computation cannot remove the entanglement between the radiation and the hole. We formulate Hawking's argument as a 'theorem': assuming 'traditional' physics at the horizon and usual assumptions of locality we will be forced into mixed states or remnants. We also argue that one cannot explain away the problem by invoking AdS/CFT duality. We conclude with recent results on the quantum physics of black holes which show that the interiors of black holes have a 'fuzzball' structure. This nontrivial structure of microstates resolves the information paradox and gives a qualitative picture of how classical intuition can break down in black hole physics.
BibTeX:
@article{mathur_information_2009,
  author = {Mathur, Samir D.},
  title = {The information paradox: a pedagogical introduction},
  journal = {Classical and Quantum Gravity},
  year = {2009},
  volume = {26},
  number = {22},
  pages = {224001},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1LS8QwEB5EEBTxLVsf0IN6ELrbpDFpvMmieNGTnkOapOJlt7hd2J_vJOmuy6qgp_YwfWQSZubLzHwBKGg_z1ZsgudNN4QZR0RZ01xSK5kpuPHcJYQEYqalnW2Yn4n4Pm46R9DH25DYR8zAMol-ZkD5gNJBKIIM8AeRQujfel5kEbz3inss8ZF5hzCCvp9fgw4Gv-NZQ824mU5-dEbB8Tzswrz6eF5wsshCf_XJfy_I_vOA9mCni0jTu7iE9mHNjQ5g62lB5zo5gI1QJ2omh1Dgsko7slU_pamnDrfj2W2q08ZZ_RYtKYq0kUoWZY7g9eH-ZfiYdecuZJpwgQG3IbWRxImq4nlV5qUhUjOuuRNo2WVuNBUSDSXOJ6_LStdEaCeYNcWN1Rwvx7CtfX3-qA19fLYHqWCuKGxdOX-uR82ELHVZeXiqpZD2RiZwhQpRTaTYUF4pyitFUa4oVVEpqrF1AtfLgiGHXpa_PJDAZZzJhTxVE6pyJXLGpMBYRkrVztoEeityGIYhsGMFS-BieQ18_aEH9L6qNSRIaALkL2LDjnjdEw60J_8ZyilshkQWIRklZ7DefkzdeeSO_ARSAPgg}
}
Floridi, L. The Information Society and Its Philosophy: Introduction to the Special Issue on "The Philosophy of Information, Its Nature, and Future Developments" 2009 The Information Society
Vol. 25(3), pp. 153-158 
article URL 
Abstract: The article introduces the special issue dedicated to "The Philosophy of Information, Its Nature, and Future Developments." It outlines the origins of the information society and then briefly discusses the definition of the philosophy of information, the possibility of reconciling nature and technology, the informational turn as a fourth revolution (after Copernicus, Darwin, and Freud), and the metaphysics of the infosphere.
BibTeX:
@article{floridi_information_2009,
  author = {Floridi, Luciano},
  title = {The Information Society and Its Philosophy: Introduction to the Special Issue on "The Philosophy of Information, Its Nature, and Future Developments"},
  journal = {The Information Society},
  year = {2009},
  volume = {25},
  number = {3},
  pages = {153--158},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV3fi9QwEB5EQRRR79RePQ_CoT7IdW2a_kh9k9VlRRHBE3wrbZI-3p02B_qH-P86kzS73XU55B7bTpOWTDJfMjPfAIhsliZbawIabZnWussUwvsWIQov0bQZXSpZZNpXB1qfbEOoskhBlrSj7j1thFu5aaq33RDi416nVDgL7RLRkOSykI75U1QU3nW6WK49CoX3NqN0guIieDh3tbBhozYYTDdxKCWOXA47TZYzT4sHEFzVISxl5ateZ9PvCNu-zm8_hPsjhmVvvdLtwQ1ztg93J8yG-3A05kOwl2xMeCIFYONK8gj-oHpuPvHRowy_h32wA_sSSiz8foNy1vPSkqA9Z4hZ2dcLQ4f9zFUPZHj_mFpcv8XO-2n7J67Rz47F9MR1snBkKmwSNDUcP4Zvi_en82UyVohIFCITnqi8JMTLK42ba51zJWq8L0Re9SprVVepvO-JiplLo_G61JL30qSV1rh1E614AvdayiQ4sy7jUEdwq8dpbyIyxRGOYgS3v9ef3snlx7m_3AuXs8Glxc1-2Ah1zK0aSTmrDoAVhSpqzXUmNW5f07YzuM7pThrZVloaEcOroF7NhecUaXigat0e1hjSqQI21h3k9L7qyr_ijf1lY8iveOWKng6DbjeoaZay9IYGEW6JOC6P4YVX99UnZ82QNWlTpXlOML2sS9_5wZacIA46hPFZDM-nE2UlQFa5LApJtRJwMGPg_yM2HznsibvBPr3eLx_CHe8dLBKePYOb9uelOfKEnH8Bpjpo7A}
}
English, S., Pen, I., Shea, N. and Uller, T. The Information Value of Non-Genetic Inheritance in Plants and Animals 2015 PLOS ONE
Vol. 10(1), pp. e0116996 
article URL 
Abstract: Parents influence the development of their offspring in many ways beyond the transmission of DNA. This includes transfer of epigenetic states, nutrients, antibodies and hormones, and behavioural interactions after birth. While the evolutionary consequences of such non-genetic inheritance are increasingly well understood, less is known about how inheritance mechanisms evolve. Here, we present a simple but versatile model to explore the adaptive evolution of non-genetic inheritance. Our model is based on a switch mechanism that produces alternative phenotypes in response to different inputs, including genes and non-genetic factors transmitted from parents and the environment experienced during development. This framework shows how genetic and non-genetic inheritance mechanisms and environmental conditions can act as cues by carrying correlational information about future selective conditions. Differential use of these cues is manifested as different degrees of genetic, parental or environmental morph determination. We use this framework to evaluate the conditions favouring non-genetic inheritance, as opposed to genetic determination of phenotype or within-generation plasticity, by applying it to two putative examples of adaptive non-genetic inheritance: maternal effects on seed germination in plants and transgenerational phase shift in desert locusts. Our simulation models show how the adaptive value of non-genetic inheritance depends on its mechanism, the pace of environmental change, and life history characteristics.
BibTeX:
@article{english_information_2015,
  author = {English, S. and Pen, I. and Shea, N. and Uller, T.},
  title = {The Information Value of Non-Genetic Inheritance in Plants and Animals},
  journal = {PLOS ONE},
  year = {2015},
  volume = {10},
  number = {1},
  pages = {e0116996},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V3Pa9swFBZjG2MwRrMfbroOdNhhY8jYViLLhx1G2lLYDoMmsFuQJRkCSZwu7v_f9yRZcboNdt4tiuXY0Sc9fe9J7xMhvEgz9sAmSFVzI1QjRKVrYWAO4kU9UbWCTlDXXD2IbPcH7B2–1-AD3KoDluU83aBAXD0GdyFWYtQAfP-OpcvgPvI1247jBduXW3Uej8krT_W7c3n9rD-HlJ_XeAUSKoyB_vq88hMGwM3YOtDh0MnOhL4RZ-BOG_rlTqKPfi8y9R6SwnEj4ki40emNPuty3i7aHG9p_Jn18ZW322g2ZFy8bzIDvNR3CUYL6EK-sasdPfFbtniBgUDJrnsQzN-2gXDLfA8qv6mI54RycVTzPW52_-RZThGMT8hL4MrQL96CEfkkd2-IqNgbPf0Y1AE__SaXAGmdIApdZjStqEDTOkAU_hMPaYUMKUB0zdkcXU5n12zcAAG00AbMia5lRr86UY3MEx4YbiagsG0UK6BhxouKomCbKXJdKFQbE5ZYad8KoGll8Lwt-SFwkSJbecSKk1CnjTwOJsg00jgHyfk2c_q-4W8_jbzxVFfTPcu6y-97RJoYTc2mEjLU0JLWeXSTIpcV-C8VrnS8F7W6MYoY3PbjAnrW36585IpS7duWoKr6Zt2uYNOuwx9YkwSD0-s3WM4Jqcer3hFZOBAgAeej8mHIYKxghNcAlbL8aCPCfxC_i_VZkEPH3UgurO_vs878hxHgY_AnZPH3a87-97rd94DERmLKg}
}
Floridi, L. The Informational Nature of Personal Identity 2011 Minds and Machines
Vol. 21(4), pp. 549-566 
article URL 
Abstract: In this paper, I present an informational approach to the nature of personal identity. In “Plato and the problem of the chariot”, I use Plato’s famous metaphor of the chariot to introduce a specific problem regarding the nature of the self as an informational multiagent system: what keeps the self together as a whole and coherent unity? In “Egology and its two branches” and “Egology as synchronic individualisation”, I outline two branches of the theory of the self: one concerning the individualisation of the self as an entity, the other concerning the identification of such entity. I argue that both presuppose an informational approach, defend the view that the individualisation of the self is logically prior to its identification, and suggest that such individualisation can be provided in informational terms. Hence, in “A reconciling hypothesis: the three membranes model”, I offer an informational individualisation of the self, based on a tripartite model, which can help to solve the problem of the chariot. Once this model of the self is outlined, in “ICTs as technologies of the self” I use it to show how ICTs may be interpreted as technologies of the self. In “The logic of realisation”, I introduce the concept of “realization” (Aristotle’s anagnorisis) and support the rather Spinozian view according to which, from the perspective of informational structural realism, selves are the final stage in the development of informational structures. The final “Conclusion: from the egology to the ecology of the self” briefly concludes the article with a reference to the purposeful shaping of the self, in a shift from egology to ecology.
BibTeX:
@article{floridi_informational_2011,
  author = {Floridi, Luciano},
  title = {The Informational Nature of Personal Identity},
  journal = {Minds and Machines},
  year = {2011},
  volume = {21},
  number = {4},
  pages = {549--566},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3fa9swED7KVsagLO1-eFlb8MP6MiZjS7ZsP460odAyxtqOvQnZsl4GWZc40P33u5NlJ0vzkIJfjC-yopN0393pPgMIHsVsY09AqxcbI4m9zlBxiDGxzRCt6ipNNHfx_bXINvAhkjH7FfUJSrdvr0rfiHOAUYCvRATPiHQbTRXN8e83P4Y8Al2ObY-nTKIv0Oc1t7Xwn2Xy-_M-1YYsF2tp0q0Gyhmj6Qj6Ipn-EMqQmV7Vzj8-pP2EP3kIrzxODb90E-sI9prZaxj134AI_ZbwBhjOs9DXNHVhxfCr4woNf9vwm0f6oS8H_vsW7qYXt5NL5j_BwGqEgpLxKhW2rtIK_chCNJq4X7RsiFRM1GgArUZwbk2cWPScqtIIbU1eiTxtcN8yRSHewYGmo_qz1pX0mQCeW1xXTUC2LsCBC-DFz_L6vLi8mnS3R_1ttHB1Z9GfNkB1umXJZJS_h9BmBnEOz02O4BX9R13WpjG25LriQufNGD71mlT3HWmHWtEz05gqHFNFY6rkGALStaIF3c51rbhETFyKDJ-cdeofGuFqwVWsCmLUkSUR8aj2ocUWNuSyRBIWFmP43Ct21RH3fsJdyuuw68i9sfTCTXEn6H-TqNTJjuHj-hwcZJ2DS4lll7TB_ie7iE08GTyRILQfduzCMbx00XVXlXkCz9r5sjntqCz_ATyDItk}
}
Floridi, L. The Latent Nature of Global Information Warfare 2014
Vol. 27(3), pp. 317-319 
article  
Abstract: Issue Title: Trends in the History and Philosophy of Computing
BibTeX:
@article{floridi_latent_2014,
  author = {Floridi, Luciano},
  title = {The Latent Nature of Global Information Warfare},
  year = {2014},
  volume = {27},
  number = {3},
  pages = {317--319}
}
Allo, P. The logic of 'being informed' revisited and revised 2011 Philosophical Studies: An International Journal for Philosophy in the Analytic Tradition
Vol. 153(3), pp. 417-434 
article  
Abstract: The logic of 'being informed' gives a formal analysis of a cognitive state that does not coincide with either belief, or knowledge. To Floridi, who first proposed the formal analysis, the latter is supported by the fact that unlike knowledge or belief, being informed is a factive, but not a reflective state. This paper takes a closer look at the formal analysis itself, provides a pure and an applied semantics for the logic of being informed, and tries to find out to what extent the formal analysis can contribute to an information-based epistemology.
BibTeX:
@article{allo_logic_2011,
  author = {Allo, Patrick},
  title = {The logic of 'being informed' revisited and revised},
  journal = {Philosophical Studies: An International Journal for Philosophy in the Analytic Tradition},
  year = {2011},
  volume = {153},
  number = {3},
  pages = {417--434}
}
D’Alfonso, S. The Logic of Knowledge and the Flow of Information 2014 Minds and Machines
Vol. 24(3), pp. 307-325 
article URL 
Abstract: In this paper I look at Fred Dretske's account of information and knowledge as developed in Knowledge and The Flow of Information. In particular, I translate Dretske's probabilistic definition of information to a modal logical framework and subsequently use this to explicate the conception of information and its flow which is central to his account, including the notions of channel conditions and relevant alternatives. Some key products of this task are an analysis of the issue of information closure and an investigation into some of the logical properties of Dretske's account of information flow.
BibTeX:
@article{dalfonso_logic_2014,
  author = {D’Alfonso, Simon},
  title = {The Logic of Knowledge and the Flow of Information},
  journal = {Minds and Machines},
  year = {2014},
  volume = {24},
  number = {3},
  pages = {307--325},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1bS-UwEB7EFRGW9V7PrkIe9EVoaZO0TR-XowdBn7zhW0jT5EU5Xk5l_fnO9Ha8PbhQaNOGaS6TzCST-QZA8CgOP8wJppTS2CpWMjW5S1FNtzZRlazS2AnD_fudbeDDTsb0NuoNlM28PXd9I8yBkGITFKihhKRGoqgiHj-_uB7sCHQ1aHtchhmuBXq75lcU3kmmbn5eIt-Q59kbM-mXAqoRRpNV6J1k-kMog2V67jv_-ZD2f1RyDX51eir72zLWOiy46Qas9jEgWDclbAJHPmMUsNmye89O-x06ZqYVQ92STe7u_9GXzu2J2GALribHl-OTsIvDEFrUB0VovHGc21hUqF4o7wrpPDeFKIVNbUb7qNxbH-fG4M26rEjK3EpXEpSccjYR2_DT0Hn9ad349VUB_PA4uFxAAi_A1gtg-aY4O1Inp-M2ud4no1njfBY91gH2aTM2wyzKd4ApJWwpyrxUuELF5wJ_lmbeyNLGZFwewWHfnfqhRe7Qc4xmaliNRdfUsPplBAF1uKZRXT8Zq0WeK8LOT0Zw0PLAQITrGdexVrjUk1kTwU3XLzVS-JAv5Vj_IhdEoevdOQ1cmeq2EFzqthQj2H_LWUPeBkOIwg0qUi6QXPKdbOMO4p2gDerf3yzCH1ih1-0Bx11YrJ-e3V4LUPkK9hcW4g}
}
Nahin, P.J. The logician and the engineer: how George Boole and Claude Shannon created the information age 2012   book URL 
BibTeX:
@book{nahin_logician_2012,
  author = {Nahin, Paul J.},
  title = {The logician and the engineer: how George Boole and Claude Shannon created the information age},
  publisher = {Princeton University Press},
  year = {2012},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV1Lb8IwDLYGu2xCYoMh9qiUP8CjdZq2ZzS0CzfuKKHuYUKVgCLx83HSB-pgx8hSosTyI87nLwAYTOeTPz5BJio2FnKhQpSExkejpUYVJ7HmEE7tyjbUcPoazNG88F570G9ZEy0i00EKHQCAcyuzoxrqPPudbotin4TRsQOduGXCzmdHyE46cJ1flmJeqjBqCKHKsWwx9bnws3yBR7I9Ca_wQPkA-vVPDKIyzAE8rxr21eMQPNa9cD6NdS90ngoWCqqYB9_AW36vFz8Tu8ymKt9sDF9hOAeKzjiCnraw97xw7XHpGITvU7YlqQNExeeaGZVmnAGxWoyvQ03vMLo_2cd_gk944pwgKKsMX9AtDifyyp1fAJkmhHo}
}
Weaver, W.N. The Mathematical Theory of Communication 1998   inbook  
BibTeX:
@inbook{Weaver1998,
  author = {Weaver, W. N.},
  title = {The Mathematical Theory of Communication},
  publisher = {University of Illinois Press},
  year = {1998}
}
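As an aside to the Shannon–Weaver entry above: the information measure the book popularised is the entropy H = -Σ pᵢ log₂ pᵢ, the average number of bits per symbol of a source. A minimal illustrative sketch (the function name is mine, not taken from the book):

```python
from math import log2

def shannon_entropy(probs):
    """Average information content, in bits, of a source whose
    outcomes occur with the given probabilities."""
    # Outcomes with zero probability contribute nothing to the sum.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))         # a fair coin: 1.0 bit per toss
print(shannon_entropy([0.9, 0.05, 0.05]))  # a biased source: ~0.57 bits
```

The biased source illustrates Shannon's central point: predictable outputs carry less information, so they can be compressed below one bit per symbol on average.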
Calude, C.S. The mathematical theory of information 2007 The Mathematical Intelligencer
Vol. 29(1), pp. 64-65 
article  
BibTeX:
@article{calude_mathematical_2007,
  author = {Calude, Cristian S.},
  title = {The mathematical theory of information},
  journal = {The Mathematical Intelligencer},
  year = {2007},
  volume = {29},
  number = {1},
  pages = {64--65}
}
Kahre, J. The Mathematical Theory of Information 2002 (684)  book  
BibTeX:
@book{Kahre2002,
  author = {Kahre, J.},
  title = {The Mathematical Theory of Information},
  publisher = {Kluwer},
  year = {2002},
  number = {684}
}
Dretske, F. The metaphysics of information 2008   book URL 
BibTeX:
@book{dretske_metaphysics_2008,
  author = {Dretske, Fred},
  title = {The metaphysics of information},
  publisher = {na},
  year = {2008},
  url = {http://wittgensteinrepository.org/agora-ontos/article/viewFile/2065/2273}
}
Ellis, B. The Metaphysics of Scientific Realism 2014   book  
BibTeX:
@book{ellis_metaphysics_2014,
  author = {Ellis, Brian},
  title = {The Metaphysics of Scientific Realism},
  publisher = {Taylor and Francis},
  year = {2014}
}
Floridi, L. The Method of Levels of Abstraction 2008 Minds and Machines
Vol. 18(3), pp. 303-329 
article URL 
Abstract: The use of “levels of abstraction” in philosophical analysis (levelism) has recently come under attack. In this paper, I argue that a refined version of epistemological levelism should be retained as a fundamental method, called the method of levels of abstraction. After a brief introduction, in section “Some Definitions and Preliminary Examples” the nature and applicability of the epistemological method of levels of abstraction is clarified. In section “A Classic Application of the Method of Abstraction”, the philosophical fruitfulness of the new method is shown by using Kant’s classic discussion of the “antinomies of pure reason” as an example. In section “The Philosophy of the Method of Abstraction”, the method is further specified and supported by distinguishing it from three other forms of “levelism”: (i) levels of organisation; (ii) levels of explanation and (iii) conceptual schemes. In that context, the problems of relativism and antirealism are also briefly addressed. The conclusion discusses some of the work that lies ahead, two potential limitations of the method and some results that have already been obtained by applying the method to some long-standing philosophical problems.
BibTeX:
@article{floridi_method_2008,
  author = {Floridi, Luciano},
  title = {The Method of Levels of Abstraction},
  journal = {Minds and Machines},
  year = {2008},
  volume = {18},
  number = {3},
  pages = {303--329},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlR1La9ZAcBAVEcRatTE-IKBexITsI5vssXxaCtaLL7wtyT4uwmfrl4I_35lNNl_7tYcKOSQw2d3M7Dyy8wIQvKrLHZmgNLcu2Ka1yjOnOe-50N5J13SqFn7nZBv4cpKx_lUlB2WU29vUN6o5UJLzHtlVlJRPjqqK9viXrz8WPwJdsdoel6XCf4Hk17xuhEuaaZbPdyk35HxzwU16rYKKyuhoD1KSTApCWTzT29z5q0Ha__GRj-DhbKcWh9PG2odbfv0Y9lIPiGIWCU_gNe6z4nNsQ138DsUJxSBt6O5woEOUmDXxFL4fffy2Oi7nxgulRQOQlYIzNyjLJVpTrWykCxLpZRUbJNpTgaFMYqGzrHa-1573unNhQBwPg-80CgRxAA96CtBfjzGRz2VwJyA3-Yw0XIboyuDeT33yoTv-tJoe99NjtYnZZtXZmCERIzOWqmqfQeFC7V0r1cD6IIULWjjtrEf7o7N9N4gc3iX6mdOpVIfZFmUmTBpquUmYNG0OGVHYEBsTNgxalkyhosZh3k5EXwbhZsNNbRqqrogqv-XKjH9HHGEHron2TqNyeJ_IuV1InD_2_JwpNy3k1AWacBc8Ak7vsM6ICJvDm4s7b4ElpYZsRI0KyGLPgd0EbDWXgKfSB-PzGy7hBdxP8TI1ewm3xz_n_tVUwPIfRx4fbA}
}
Dretske, F.I. The Philosophy of Information 2008, pp. 29-48  inbook  
BibTeX:
@inbook{Dretske2008,
  author = {Dretske, Fred. I.},
  title = {The Philosophy of Information},
  publisher = {Elsevier},
  year = {2008},
  pages = {29-48}
}
Floridi, L. The philosophy of information 2011   book  
BibTeX:
@book{floridi_philosophy_2011,
  author = {Floridi, Luciano},
  title = {The philosophy of information},
  publisher = {Oxford University Press},
  year = {2011}
}
Kamp, R. and Stokhof, M. The Philosophy of Information 2008   inbook  
BibTeX:
@inbook{Kamp2008,
  author = {Kamp, R. and Stokhof, M.},
  title = {The Philosophy of Information},
  publisher = {Elsevier},
  year = {2008}
}
Greco, G.M., Paronitti, G., Turilli, M. and Floridi, L. The philosophy of information a methodological point of view 2005 CEUR Workshop Proceedings
Vol. 130 
article URL 
BibTeX:
@article{greco_philosophy_2005,
  author = {Greco, Gian M. and Paronitti, Gianluca and Turilli, Matteo and Floridi, Luciano},
  title = {The philosophy of information a methodological point of view},
  journal = {CEUR Workshop Proceedings},
  year = {2005},
  volume = {130},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V3NS8MwFA-iIIKIQ-3mB-aiF2mp3Zq1oAed08HmRTfwVtKmxR1c6pqB_vfmJUvbDfYPeExbyINf-r7yfu8h1PYc117TCWrGS0a7Mhpx3ThgAck6tx7UGJLAT1RnpXpm24zPq579F-BzM6DgV7eEKCmKN2ZmdKnzcj7V1QDlHYHpW9CfvKlcevHJc00oUIaudMJfIJhWeXVQEq9UFywbLjXoCl0qAK9htlpVyjGHJI9mCgmR8uoM8fmUabr2ArItfCUn4ddyElqNEpgZ4eohJaWeXV7ArPS8XrNFZYVgm4Rel0j_5BpaoH-xaSLu05k9eZfWVjqxoMsehyOIps1W0Os14fmiqPkH40N0sHTs8YMGpIG20tkRupNg4AoMzDNcAwNTvAoGVmDAVwDGMbp87o97A1tvGOW6MUhUCu2doH0KDIWZUExGZqFtMV-kFlh4S8pmod2PcPQUDIY9vWyYpVMotp3zLSzpUKgzaROn20TYZ77HYkazJEs6JO4EIZXOBEtoLGNempIWulqTx4sKL3KjQAalYaguVCPxI1qouUnu082vztBehfQ52snkT5Je6OaVfytWPg4}
}
Floridi, L. The Philosophy of Information as a Conceptual Framework 2010 Knowledge Technology and Policy
Vol. 23, pp. 1-29 
article  
BibTeX:
@article{floridi_philosophy_2010,
  author = {Floridi, Luciano},
  title = {The Philosophy of Information as a Conceptual Framework},
  journal = {Knowledge Technology and Policy},
  year = {2010},
  volume = {23},
  pages = {1--29}
}
Floridi, L. The philosophy of information: Ten years later 2011, pp. 153-170  incollection URL 
BibTeX:
@incollection{floridi_philosophy_2011-1,
  author = {Floridi, Luciano},
  title = {The philosophy of information: Ten years later},
  year = {2011},
  pages = {153--170},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV1bS8MwFA6CIKKg89J5g7zoy2hNk7ZLBJ-mMtjenODbSJMWB7pOuz3s33vS9LILvvs0skI5PUlPvnOa7zsIMeoRdyMmxJLrhIepr7tCCyU40QKQP2BnxRXTer2yXfcxaP775xO_gYjN8JaSuvxaTEP2M9GW8bwwBYtsc03Mqt4FS6sWUbMXbaF92lmCC_LOp6zO7eqmxhm4lmFvM0NIkgJIPaPuamjyrShvucv5tl3HVgC1gqyFEBbcQ0ScRZ76KE-erqlVb-wi9dk-yFF8CrAquDPi5V96ouaPydR9e4V90mj_mMR5MKzrYBBBDFu34NxVVldSXNW4JDuBbffblhmNV5XNFvkKLhgdowPDFcGGxAHGtdBOMj1BR1WnDFwGzlPkgedx43mcpXjF8w8Y_I4Lv-PC72do9PI86vXdslOFm0ecuHGguyqWfsLhJ1IkTbkmYRoDvEtjSrUIZaAkhWRU-IBoTY4ZMsmSmBFJUxHRc3QoDaFhOi-Ij9pBuykswsQxiMCBZ3LQ3rsYPvH-oGeHrWro5QU7z_ueOwBAijXsRl63jTAL4d0KExWRhAZhrLn5is2kBCMkYcq_QG3ruPHMCpuM66m7_PvSFdpvltx1aeeNFbH8BUyoLAc}
}
Landauer, R. The physical nature of information 1996 Physics Letters A
Vol. 217(4), pp. 188-193 
article  
BibTeX:
@article{landauer_physical_1996,
  author = {Landauer, Rolf},
  title = {The physical nature of information},
  journal = {Physics Letters A},
  year = {1996},
  volume = {217},
  number = {4},
  pages = {188--193}
}
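A note on the Landauer entry above: the paper's slogan "information is physical" rests on Landauer's principle, which bounds the heat dissipated by erasing one bit at k_B·T·ln 2. A back-of-the-envelope check (my own illustration, not code from the paper):

```python
from math import log

BOLTZMANN = 1.380649e-23  # J/K, the exact value fixed by the 2019 SI redefinition

def landauer_bound(temperature_kelvin):
    """Minimum energy, in joules, dissipated when one bit is erased
    at the given temperature, per Landauer's principle."""
    return BOLTZMANN * temperature_kelvin * log(2)

# At room temperature (300 K) the bound is a few zeptojoules,
# many orders of magnitude below what current hardware dissipates per bit.
print(landauer_bound(300.0))  # ≈ 2.87e-21 J
```

The tiny magnitude is why the bound matters conceptually rather than practically: it ties logical irreversibility to thermodynamics regardless of the hardware.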
Howell, R.J. The Physicalist's Tight Squeeze: A Posteriori Physicalism vs. A Priori Physicalism 2015 Philosophy Compass
Vol. 10(12), pp. 905-913 
article  
Abstract: Both a priori physicalism and a posteriori physicalism combine a metaphysical and an epistemological thesis. They agree about the metaphysical thesis: our world is wholly physical. Most agree that this requires everything that there is must be necessitated by the sort of truths described by physics. If we call the conjunction of the basic truths of physics P, all physicalists agree that P entails Q for any truth Q. Where they disagree is whether or not this entailment can be known a priori. The a priori physicalist says it can, the a posteriori physicalist says it cannot. Though a posteriori physicalism is probably the dominant view, it is really a surprising and somewhat unlikely stance. In this article, the nature of the view is discussed, and two arguments are presented that should cause us to look again at the potential of a priori physicalism.
BibTeX:
@article{howell_physicalists_2015,
  author = {Howell, Robert J.},
  title = {The Physicalist's Tight Squeeze: A Posteriori Physicalism vs. A Priori Physicalism},
  journal = {Philosophy Compass},
  year = {2015},
  volume = {10},
  number = {12},
  pages = {905--913}
}
Sterner, B. The Practical Value of Biological Information for Research 2014 Philosophy of Science
Vol. 81(2), pp. 175-194 
article URL 
Abstract: Many philosophers are skeptical about the scientific value of the concept of biological information. However, several have recently proposed a more positive view of ascribing information as an exercise in scientific modeling. I argue for an alternative role: guiding empirical data collection for the sake of theorizing about the evolution of semantics. I clarify and expand on Bergstrom and Rosvall's suggestion of taking a "diagnostic" approach that defines biological information operationally as a procedure for collecting empirical cases. The more recent modeling-based accounts still perpetuate a theory-centric view of scientific concepts, which motivated philosophers' misplaced skepticism in the first place.
BibTeX:
@article{sterner_practical_2014,
  author = {Sterner, Beckett},
  title = {The Practical Value of Biological Information for Research},
  journal = {Philosophy of Science},
  year = {2014},
  volume = {81},
  number = {2},
  pages = {175--194},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1Lb9QwEB4hkFAlBGyBNECFkTjAIYsdex2HG1qoKsGhQoC4WY5jn1btls0e-PfMxEnYLhS4JXFe9jyTmfkGQJZzXuzpBKNahZ4pF1H7JgQeQ0TTtgiqDL6q-94TO3-2gV0T0Df6NTq4uqKCPVlJEsRPy7MpaiAMT5pXisKUyuz0EkqXXTE–yqYwEL9xXq72W9E8keb1Nufk3swNuYa806mYPSvcvnf87Kvndd9uDs4ouxt4pwZ3Ajnh3BwNnY2-HEIs0HuN-zlAE796gG8Qb5iCeQIqcu-utU2sIvIUk_L_thQ4UQUZ7jFxuS-h_Dl5P3n5Wkx9F8ovECzXehAGAWu9jw2kVNEVJi65D42wmnluGzdonSmFk1EJ4e3UkXtQmxKrrlvRSUfwR1HefrnXV_P12ZwK6JQhYwMXYZLmMHtb_XHd-b0wzLtzsbd-aYvOptfdhlSt5fJQs-rI2BRG-e8LutQR1WbhQmqwW9aqsoKsvUqh2cjje06IXbYPtJutE2LnENGpLckwh0umJUVAbGhLsORkRtsu1pZKalwlzDwcjhKzDHdk_hP4TLxHI6vsIsdtMBmetzzv47bdRtzeLHLYtND6JNVaoWz7MMwOYj_OW05wLsTrEH3-B-v9wQO0AkcspGews3u-zYcJ1DKnw9IGH0}
}
Dirac, P.A.M. The Principles of Quantum Mechanics 1981   book  
BibTeX:
@book{dirac,
  author = {Paul Adrien Maurice Dirac},
  title = {The Principles of Quantum Mechanics},
  publisher = {Clarendon Press},
  year = {1981}
}
Berta, M., Christandl, M. and Renner, R. The Quantum Reverse Shannon Theorem Based on One-Shot Information Theory 2011 Communications in Mathematical Physics
Vol. 306(3), pp. 579-615 
article  
Abstract: The Quantum Reverse Shannon Theorem states that any quantum channel can be simulated by an unlimited amount of shared entanglement and an amount of classical communication equal to the channel's entanglement assisted classical capacity. In this paper, we provide a new proof of this theorem, which has previously been proved by Bennett, Devetak, Harrow, Shor, and Winter. Our proof has a clear structure being based on two recent information-theoretic results: one-shot Quantum State Merging and the Post-Selection Technique for quantum channels.
BibTeX:
@article{berta_quantum_2011,
  author = {Berta, Mario and Christandl, Matthias and Renner, Renato},
  title = {The Quantum Reverse Shannon Theorem Based on One-Shot Information Theory},
  journal = {Communications in Mathematical Physics},
  year = {2011},
  volume = {306},
  number = {3},
  pages = {579--615}
}
Godfrey-Smith, P. The role of information and replication in selection processes 2001 Behavioral and Brain Sciences
Vol. 24(3), pp. 538-538 
article URL 
Abstract: Hull et al. argue that information and replication are both essential ingredients in any selection process. But both information and replication are found in only some selection processes, and should not be included in abstract descriptions of selection intended to help researchers discover and describe selection processes in new domains.
BibTeX:
@article{godfrey-smith_role_2001,
  author = {Godfrey-Smith, Peter},
  title = {The role of information and replication in selection processes},
  journal = {Behavioral and Brain Sciences},
  year = {2001},
  volume = {24},
  number = {3},
  pages = {538--538},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1NT9wwEB2VVkVIiBZYwvIh-VBxyyp2nMS5VNpuu1qpHEHiZjmJc6LLsglS-fedsZOwYjnQW6xYVjIej2fsN28AYjGJwlc2wYqqzCL0NWxZJBYtYCpLzguVlDwzyhVz2DjZhr5oXydPlH6fwOSpgws6BihbD01KRHJHNwboWmDsvoP7MwXts-kL0kO6mmkOyUidN2kUti0xcYaWD6un5s2tyG078y_Qg2t7uMlwB_2SJb8Nx_7v3_kKB51fyqZekQ7hg10ewfF0iTH5n2d2xRxS1B3BH8HnH_3T3mA_n4_hO6ocI7Qie6hZR8hK087MsmJrO1yU4zvWuPI71Fj5TAXbjOB2_utmtgi78gxhyWMVhdLEhlsMUEwhDI8MrnxZ0y1jin6AyLmN05ro3vOstrGwBUZyiSINiA12RNNyAvuGYPzL1qX7VQF8qlE6NqB9MEBRB7B7l1__VIvfM9887JuTxuWkTR7bAGfZLdkwnWSnwFRVFUrlsjCmlkkk8NuUTY0UprRZpaoxfBukr_sJ2Zb8GE5ILzRJiyZHZ-iKJnHOxzDqNUVX9_dayVzyXAoc16uNXnmaEC10I7QjX5Wu4CTX7d92DMGrbiiqLI2kxAE21W147ymN0O2j0hUZfhh_T7dZx_BOzAbt2fv–Rz2PMKOzpgu4GO7frKXnp7yHzt4HnM}
}
Peacocke, A. The sciences of complexity: a new theological resource? 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 249-281  incollection  
BibTeX:
@incollection{peacocke_sciences_2010,
  author = {Peacocke, Arthur},
  title = {The sciences of complexity: a new theological resource?},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {249--281},
  note = {DOI: 10.1017/CBO9780511778759.012}
}
Hutter, M. The subjective computable universe 2012   incollection  
BibTeX:
@incollection{hutter_subjective_2012,
  author = {Hutter, Marcus},
  title = {The subjective computable universe},
  year = {2012}
}
Clifton, R. The Subtleties of Entanglement and its Role in Quantum Information Theory 2002 Philosophy of Science
Vol. 69(S3), pp. S150-S167 
article URL 
Abstract: My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, and entanglement thermodynamics.
BibTeX:
@article{clifton_subtleties_2002,
  author = {Clifton, Rob},
  title = {The Subtleties of Entanglement and its Role in Quantum Information Theory},
  journal = {Philosophy of Science},
  year = {2002},
  volume = {69},
  number = {S3},
  pages = {S150--S167},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV1LS-wwFD7IXQni9ckdH5iFC12MNo9pm5WIKLpT0XVJmkSEufUxHcR_78mj1REFcd0mbXNOTr7mfPkOAGcH2fBTTKip0qXhNhd-00Fqbh2T1jlhaV1IJ2d3tt-pOi8vB4ElGHL6CJf02B525YEw_paCHT0-DX0NKZ9rTQU1MCBTluVR7LVPKdAyi2GZ02HJfL2fDwtRCseRkjgDNmdrkny5PIWl6OwvdLymjoLS56XfT85_QdH-1ScuwWICrOQ4etgyzNlmBeYvuwoIryuwnOLDhOwlEev9VbhA_yMYlVov7o2XHhw5bRCI3kW6OlGNIffthFw_jC25b8jVFE08_U_S8SjvLiTqBqzB7dnpzcn5MJVtQCsjPsLwKZXKhC40M1TXrFA6q1WBSCqXupSI8ZzDvxwj8pEyghZFZka8kLLWSkgjHV-HBeXp_U0bjgGaf0BUzUuD0MfpmgvsVynFJLW5B2RcCD6Anc6Y1WOU6ahCer3MqziIA1j3Nq78vG2fVV1JLkb-j3wAa2H8-3YcUVImM2zQeUFlxuOKeSRFc99ge8YnqjTrJ_2Tdj_6SN9v0CbyAoBFOMA-APqT206SVLuXKGg3vnnXTZgPVWoC920L_rTPU7sdhSXfAFSfEhM}
}
Bringsjord, S. The symbol grounding problem … remains unsolved 2015 Journal of Experimental & Theoretical Artificial Intelligence
Vol. 27(1), pp. 63-72 
article  
Abstract: Taddeo and Floridi [2007. A praxical solution of the symbol grounding problem. Minds and Machines, 17, 369-389. (This paper is reprinted in Floridi, L. (2011). The philosophy of information. Oxford: Oxford University Press)] propose a solution to the symbol grounding problem (SGP). Unfortunately, their proposal, while certainly innovative, interesting and – given the acute difficulty of SGP – brave, merely shows that a class of robots can in theory connect, in some sense, the symbols it manipulates with the external world it perceives, and can, on the strength of that connection, communicate in sub-human fashion.
BibTeX:
@article{bringsjord_symbol_2015,
  author = {Bringsjord, Selmer},
  title = {The symbol grounding problem … remains unsolved},
  journal = {Journal of Experimental & Theoretical Artificial Intelligence},
  year = {2015},
  volume = {27},
  number = {1},
  pages = {63--72}
}
McEliece, R.J. The theory of information and coding 2002
Vol. 86 
book  
BibTeX:
@book{mceliece_theory_2002,
  author = {McEliece, Robert J.},
  title = {The theory of information and coding},
  publisher = {Cambridge University Press},
  year = {2002},
  volume = {86},
  edition = {2nd}
}
Bergstrom, C.T. and Rosvall, M. The transmission sense of information 2011 Biology & Philosophy
Vol. 26(2), pp. 159-176 
article URL 
Abstract: Biologists rely heavily on the language of information, coding, and transmission that is commonplace in the field of information theory developed by Claude Shannon, but there is open debate about whether such language is anything more than facile metaphor. Philosophers of biology have argued that when biologists talk about information in genes and in evolution, they are not talking about the sort of information that Shannon's theory addresses. First, philosophers have suggested that Shannon's theory is only useful for developing a shallow notion of correlation, the so-called "causal sense" of information. Second, they typically argue that in genetics and evolutionary biology, information language is used in a "semantic sense," whereas semantics are deliberately omitted from Shannon's theory. Neither critique is well-founded. Here we propose an alternative to the causal and semantic senses of information: a transmission sense of information, in which an object X conveys information if the function of X is to reduce, by virtue of its sequence properties, uncertainty on the part of an agent who observes X. The transmission sense not only captures much of what biologists intend when they talk about information in genes, but also brings Shannon's theory back to the fore. By taking the viewpoint of a communications engineer and focusing on the decision problem of how information is to be packaged for transport, this approach resolves several problems that have plagued the information concept in biology, and highlights a number of important features of the way that information is encoded, stored, and transmitted as genetic sequence.
Keywords: Information; Evolution; Shannon theory; Natural selection; Entropy; Mutual information
BibTeX:
@article{bergstrom_transmission_2011,
  author = {Bergstrom, Carl T. and Rosvall, Martin},
  title = {The transmission sense of information},
  journal = {Biology & Philosophy},
  year = {2011},
  volume = {26},
  number = {2},
  pages = {159--176},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlR3LTsMwzEKTkHbh_Sgv9bILoqhp1jQ9TjzEEfHYNWqSFiHGNo3tsH09drt0DzjAuW6b2I7fdgB4dB0GazIhC41I0kgYZpi0uc1MEXFOXZJo4WmzFtmGqI5k9D-uXYKylNtLrW8xTwMK7adMhsEMpTCqKirqe3ru1rKYtH013TsNuBSJy2v-9oUVzbQun5cSpWtDRUtFdL8NrkHGFaDUWelF3_zPAu1_bHAHtuY2qt-pmGoXNvL-HmxWt1ZO96D56K4_mO5DC_nMH5PCQ4ahyJv_hY5x7g8Kfz6TlSh_AK_3dy83D8H86oXAcBR_gUTH1RiBtofmGj3omBk6_UJIxrVMbWQE0xwPP7OJLqTJTWyNJW8MPSibhfwQGv1BPz8Gv9BxKtNE5zph7cKkWrSLKDNJlqHxqUPtwaVDuxpWEzbUYpYyIUEhEhQhQc08uCDCqKpJtD6diuPi0BiKpQdHJQBtETdvVp44Yirb6ynkP4nmWZQwD64cCZZWQD-m1JGaY7tawdAWHrR-gJMZ5d4RKiphCa7kkxqKRnnfvnc7ajB6U5PPiWrTNP-TP37vFJpVQJsK4M6gMR5N8vNqeuQ3PgD9Zw}
}
Lloyd, S. The universe as quantum computer 2012   incollection  
BibTeX:
@incollection{lloyd_universe_2012,
  author = {Lloyd, Seth},
  title = {The universe as quantum computer},
  year = {2012}
}
Lloyd, S. The universe as quantum computer 2013   article  
Abstract: This article reviews the history of digital computation, and investigates just how far the concept of computation can be taken. In particular, I address the question of whether the universe itself is in fact a giant computer, and if so, just what kind of computer it is. I will show that the universe can be regarded as a giant quantum computer. The quantum computational model of the universe explains a variety of observed phenomena not encompassed by the ordinary laws of physics. In particular, the model shows that the the quantum computational universe automatically gives rise to a mix of randomness and order, and to both simple and complex systems.
BibTeX:
@article{lloyd_universe_2013,
  author = {Lloyd, Seth},
  title = {The universe as quantum computer},
  year = {2013}
}
Harms, W.F. The Use of Information Theory in Epistemology 1998 Philosophy of Science
Vol. 65(3), pp. 472-501 
article URL 
Abstract: Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
BibTeX:
@article{Harms1998,
  author = {Harms, W. F.},
  title = {The Use of Information Theory in Epistemology},
  journal = {Philosophy of Science},
  year = {1998},
  volume = {65},
  number = {3},
  pages = {472--501},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV09T8MwED2hTl2AfkGACo8wBOzYcdwJIdSKkYHOke3aElJpS1sG_j127LQN6sIcRYl99vmd7909AJo94PSPTygkMVZypVjOJNYqU1xoRSxhucayKlI7uNmGWrjQkywrlmCV03dwSc3NI3GgUJCn1VfqtaN8jjUKaThHTDLMQ5PXXSqBCBzcMSWpyLzOz8EBFN1woCI2QGZTi-TgzJmcQU0frbkmuwT0vkT-CBf7X2M5h9OISNFzWEIdODGLLrTfaomDny50ogPYoLvYpfq-B6lbYGi6MWhpUaxp8jZGodgffSzQeOWX0Gd1b9-H6WT8_vKaRu2FVPuG7qkQzBQOLElmc8ZdxK2sMtT6ZjOeGWUEppab3DcszLixwsHKGXVDkVSJGdaGDqC1WC7MJSAPejLGJdNcMReQjrDVSikHlKx1DgAncFsbpFyFFhtllRoXvHTojedFAgNvp9Lvue1a6tIFSS6yxiSBXjWl-_eq-UzgojZkOZvPS4-ZMHdgkyXQbzyhDn6NCpInMGzYu4w7eRP_4Or4h66hHWoUPefsBlrb9bcZhoaOv_iZ5SU}
}
Fyffe, R. The Value of Information: Normativity, Epistemology, and LIS in Luciano Floridi 2015 PORTAL-LIBRARIES AND THE ACADEMY
Vol. 15(2), pp. 267-286 
article  
Abstract: This paper is a critical reconstruction of Luciano Floridi's view of librarianship as "stewardship of a semantic environment," a view that is at odds with the dominant tradition in which library and information science (LIS) is understood as social epistemology. Floridi's work helps to explain the normative dimensions of librarianship in ways that epistemology does not, and his Philosophy of Information frames librarians' traditional stewardship role in terms appropriate for our growing involvement in the management and preservation of information through its entire life cycle. Floridi's work also helps illuminate what is coming to be called "knowledge as a commons." Librarianship is concerned with maintaining and enhancing information environments over time, environments that include the behavior of the people who create and use them. The integrity of these environments makes possible the epistemic projects of faculty, students, and other researchers, but librarianship is not, itself, epistemological. Floridi's ecological reframing of philosophy of information and information ethics, bridging the dichotomy between information and user, has a variety of implications for information literacy education and other academic library services in higher education.
BibTeX:
@article{fyffe_value_2015,
  author = {Fyffe, R.},
  title = {The Value of Information: Normativity, Epistemology, and LIS in Luciano Floridi},
  journal = {PORTAL-LIBRARIES AND THE ACADEMY},
  year = {2015},
  volume = {15},
  number = {2},
  pages = {267--286}
}
Berto, F. and Tagliabue, J. The World Is either Digital or Analogue 2014 Synthese
Vol. 191(3), pp. 481-497 
article  
Abstract: We address an argument by Floridi (Synthese 168(1):151–178, 2009; 2011a), to the effect that digital and analogue are not features of reality, only of modes of presentation of reality. One can therefore have an informational ontology, like Floridi's Informational Structural Realism, without commitment to a supposedly digital or analogue world. After introducing the topic in Sect. 1, in Sect. 2 we explain what the proposition expressed by the title of our paper means. In Sect. 3, we describe Floridi's argument. In the following three sections, we raise three difficulties for it: (i) an objection from intuitions: Floridi's view is not supported by the intuitions embedded in the scientific views he exploits (Sect. 4); (ii) an objection from mereology: the view is incompatible with the world's having parts (Sect. 5); (iii) an objection from counting: the view entails that the question of how many things there are doesn't make sense (Sect. 6). In Sect. 7, we outline two possible ways out for Floridi's position. Such ways out involve tampering with the logical properties of identity, and this may be bothersome enough. Thus, Floridi's modus ponens will be our (and most ontologists') modus tollens.
BibTeX:
@article{berto_world_2014,
  author = {Berto, F. and Tagliabue, J.},
  title = {The World Is either Digital or Analogue},
  journal = {Synthese},
  year = {2014},
  volume = {191},
  number = {3},
  pages = {481--497}
}
Floridi, L. The Logic of Being Informed 2006 Logique & Analyse, pp. 2-27  article  
BibTeX:
@article{Floridi2006,
  author = {Floridi, L.},
  title = {The Logic of Being Informed},
  journal = {Logique & Analyse},
  year = {2006},
  pages = {2-27}
}
Brüning, E. and Petruccione, F. Theoretical foundations of quantum information processing and communication: selected topics 2010
Vol. 787 
book URL 
Abstract: Based on eight extensive lectures selected from those given at the renowned Chris Engelbrecht Summer School in Theoretical Physics in South Africa, this text on the theoretical foundations of quantum information processing and communication covers an array of topics, including quantum probabilities, open systems, and non-Markovian dynamics and decoherence. It also addresses quantum information and relativity as well as testing quantum mechanics in high energy physics. Because these self-contained lectures discuss topics not typically covered in advanced undergraduate courses, they are ideal for post-graduate students entering this field of research. Some of the lectures are written at a more introductory level while others are presented as tutorials that survey recent developments and results in various subfields.
BibTeX:
@book{bruning_theoretical_2010,
  author = {Brüning, Erwin and Petruccione, F.},
  title = {Theoretical foundations of quantum information processing and communication: selected topics},
  publisher = {Springer},
  year = {2010},
  volume = {787},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3PT4MwFH7RGRO9qNMpziWNNw-YljrAkwd1MdGLye4NLcUsTjYHM_75vhbo2NQjtBB4L319v76vADy4pv6GTZBc3sQ0U1kYKorBXKo0HcokTgOFMZhk65ltaDqJmmYOV-FdYdB_NzvbRvmmmmmNfH0A1d3bisrARU7cR8_bp6bO52PIvx3bZf3y8OoyNJQHaLhvDSAEp9qZw4qyx12bjElSvKM5QlNVFg6sZPhL8mmruLrG8me3rtEh7GiDZziCLZ134aA5xYHUi7oL-y1Kwi7s2pZQVRzD83iFcSSZO32pILOMfC5RIcsPUtOumvtkXkEO8C0kyVOi2siTExiMHsf3T775PFGnjITk6HSbnZv3oJPPcn0GBAPJUEdK0hR9sCTikg81uhchyyhTMcs8uGyJQnxNbbW3EC15sciD00pCYl6RaQhmSpkxCz3ob4yU32Vr1EOBCjmR00lN5W1oygOO7pIHV42Y3bPrGmZ-JFCTUUw96P39o-f_DfRhr2oYMFmXC-iUi6UeVNr8AR1z21o}
}
Ibekwe-SanJuan, F. and Dousa, T.M. Theories of Information, Communication and Knowledge: A Multidisciplinary Approach 2013
Vol. 34 
book  
Abstract: This book addresses some of the key questions that scientists have been asking themselves for centuries: what is knowledge? What is information? How do we know that we know something? How do we construct meaning from the perceptions of things? Although no consensus exists on a common definition of the concepts of information and communication, few can reject the hypothesis that information - whether perceived as « object » or as « process » - is a pre-condition for knowledge. Epistemology is the study of how we know things (anglophone meaning) or the study of how scientific knowledge is arrived at and validated (francophone conception). To adopt an epistemological stance is to commit oneself to render an account of what constitutes knowledge or, in procedural terms, to render an account of when one can claim to know something. An epistemological theory imposes constraints on the interpretation of human cognitive interaction with the world. It goes without saying that different epistemological theories will have more or less restrictive criteria to distinguish what constitutes knowledge from what is not. If information is a pre-condition for knowledge acquisition, giving an account of how knowledge is acquired should impact our comprehension of information and communication as concepts. While a lot has been written on the definition of these concepts, less research has attempted to establish explicit links between differing theoretical conceptions of these concepts and the underlying epistemological stances. This is what this volume attempts to do. It offers a multidisciplinary exploration of information and communication as perceived in different disciplines and how those perceptions affect theories of knowledge.
BibTeX:
@book{ibekwe-sanjuan_theories_2013,
  author = {Ibekwe-SanJuan, Fidelia and Dousa, Thomas M.},
  title = {Theories of Information, Communication and Knowledge: A Multidisciplinary Approach},
  publisher = {Springer Netherlands},
  year = {2013},
  volume = {34},
  edition = {1}
}
Zurek, W.H. Thermodynamic cost of computation, algorithmic complexity and the information metric 1989 Nature
Vol. 341(6238), pp. 119-124 
article  
Abstract: Algorithmic complexity is a measure of randomness, which is defined without recourse to probabilities. A study shows that algorithmic complexity sets limits on the thermodynamic costs of computations, casts a new light on the limitations of Maxwell's demon, and can be used to define distance between binary strings.
BibTeX:
@article{zurek_thermodynamic_1989,
  author = {Zurek, W. H.},
  title = {Thermodynamic cost of computation, algorithmic complexity and the information metric},
  journal = {Nature},
  year = {1989},
  volume = {341},
  number = {6238},
  pages = {119--124}
}
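Zurek's thermodynamic-cost results build on Landauer's principle (also flagged in the Bawden and Robinson entry above): erasing one bit of information dissipates at least k_B·T·ln 2 of heat. A minimal Python sketch of that bound; the 300 K room-temperature figure is an illustrative choice, not a value from the paper:

```python
import math

BOLTZMANN = 1.380649e-23  # k_B in J/K (exact, 2019 SI redefinition)

def landauer_bound_joules(bits: float, temperature_kelvin: float) -> float:
    """Minimum heat dissipated when erasing `bits` bits at the given temperature."""
    return bits * BOLTZMANN * temperature_kelvin * math.log(2)

# Erasing a single bit at room temperature (300 K) costs on the order of 1e-21 J.
print(landauer_bound_joules(1, 300.0))
```

The bound scales linearly in both the number of erased bits and the temperature, which is why it is usually quoted "per bit, at temperature T".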
Bennett, C.H., Gács, P., Li, M., Vitányi, P.M.B. and Zurek, W.H. Thermodynamics of Computation and Information Distance 1993 Proceedings of the Twenty-fifth Annual ACM Symposium on Theory of Computing, pp. 21-30  inproceedings DOI URL 
BibTeX:
@inproceedings{bennett_thermodynamics_1993,
  author = {Bennett, Charles H. and Gács, Péter and Li, Ming and Vitányi, Paul M. B. and Zurek, Wojciech H.},
  title = {Thermodynamics of Computation and Information Distance},
  booktitle = {Proceedings of the Twenty-fifth Annual ACM Symposium on Theory of Computing},
  publisher = {ACM},
  year = {1993},
  pages = {21--30},
  url = {http://doi.acm.org/10.1145/167088.167098},
  doi = {10.1145/167088.167098}
}
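The information distance of Bennett et al. is defined via Kolmogorov complexity and is therefore uncomputable; a standard practical proxy from the later Li–Vitányi line of work replaces K with the output length of a real compressor. A rough sketch using zlib (the example strings are made up for illustration):

```python
import zlib

def c(data: bytes) -> int:
    """Compressed length: a computable upper-bound proxy for Kolmogorov complexity K(data)."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for similar inputs, near 1 for unrelated ones."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

s1 = b"the quick brown fox jumps over the lazy dog " * 20
s2 = b"pack my box with five dozen liquor jugs now " * 20
print(ncd(s1, s1))  # near 0: a second copy adds almost no new information
print(ncd(s1, s2))  # much larger: the two texts share little structure
```

Because zlib only upper-bounds K, the resulting distance is an approximation and can slightly exceed 1, but it preserves the qualitative behaviour of the theoretical metric.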
Tribus, M. Thermostatics and thermodynamics: an introduction to energy, information and states of matter, with engineering applications 1961   book URL 
BibTeX:
@book{tribus_thermostatics_1961,
  author = {Tribus, Myron},
  title = {Thermostatics and thermodynamics: an introduction to energy, information and states of matter, with engineering applications},
  publisher = {Van Nostrand},
  year = {1961},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwdV3NCoMwDA5zYzDYYX-yP8EXcMTq1J3HZA-wu7Q23uZg-v4sVXdwuGNbSAlNmnwhXwsQiBN6P3cCSRQFKZVgWFCsIhI5-hpjJXXMMT3qV7aHcGOPgP6tYHA-jJxRWGAxyGu9sveeXhMk0gWMDXFgCSMqVzBtWivzag0un8T7-TLMHR67jNzdupnR7Wfw1Qac9Pa43j0jMevqKZnqdhXChrk0fehl3fDV9BZcnUgKdEIy1EXIwZkVzjFUEdJZE15oB_awsP2_hQPM2Ev8FvYfYVKwWZLTavkBsuplZg}
}
Floridi, L. Things 2013 Philosophy & Technology
Vol. 26(4), pp. 349-352 
article URL 
Abstract: Issue Title: Online Security and Civil Rights
BibTeX:
@article{floridi_things_2013,
  author = {Floridi, Luciano},
  title = {Things},
  journal = {Philosophy & Technology},
  year = {2013},
  volume = {26},
  number = {4},
  pages = {349--352},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV07T8MwED6higEGHuVVBFIHNpSqfsXNiBAVI4KK1bIdm4EqVFUZ-PfcJXGLGgZYnUuknOzzd6_vAAQfjbMtm4Ag27uSuaB0YTULHFGG1sSWpqVzeiuyDXwdyajeRylBWdvtTeubEJKKJqkaSBQZWWG8qqio7_nldR1lwS2Xi3puK0ffJlNSiJTa_O0jXbvcSZDW9870EFI_TKo3WSehN23y3Xrsf_zPERy0kHR41-yhY9gJVR_2fxAV9mHvKU08-DqB3WbW5ynMpg-z-8esHaeQvaHfV2RMOfQfonYxKEkrimsWpUUQwhzLbeRFrtEOo320hfKTOHbeR0QU2ltiaT-DXvVRhQsYaj8unVfC5TbIiSonTDtZ5hHRkIrcygGckxoNnZHV0npDFDKSfCV8kjRryvncCEaJT-p2HcBtUoxZNEQbptaJQYgiTKsDXCrMoowDuOlI13LtK7mRJHr5N7Er6K2Wn-G6oWL8Br6Bwic}
}
Sasaki, M. and Yeom, D.-h. Thin-shell bubbles and information loss problem in anti de Sitter background 2014 Journal of High Energy Physics
Vol. 2014(12), pp. 1-16 
article URL 
Abstract: We study the motion of thin-shell bubbles and their tunneling in anti de Sitter (AdS) background. We are interested in the case when the outside of a shell is a Schwarzschild-AdS space (false vacuum) and the inside of it is an AdS space with a lower vacuum energy (true vacuum). If a collapsing true vacuum bubble is created, classically it will form a Schwarzschild-AdS black hole. However, this collapsing bubble can tunnel to a bouncing bubble that moves out to spatial infinity. Then, although the classical causal structure of a collapsing true vacuum bubble has the singularity and the event horizon, quantum mechanically the wavefunction has support for a history without any singularity or event horizon, which is mediated by the non-perturbative quantum tunneling effect. This may be regarded as an explicit example that shows the unitarity of an asymptotic observer in AdS, while a classical observer who only follows the most probable history effectively loses information due to the formation of an event horizon.
BibTeX:
@article{sasaki_thin-shell_2014,
  author = {Sasaki, Misao and Yeom, Dong-han},
  title = {Thin-shell bubbles and information loss problem in anti de Sitter background},
  journal = {Journal of High Energy Physics},
  year = {2014},
  volume = {2014},
  number = {12},
  pages = {1--16},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV1LS8NAEB5EEATxLcYH7MFDBVOymzSPo5SWIh7Eqtewr0BpicWk_9-ZvKzWQ49hZ2Gzs-w3szPzDYAv-p77507gVfhJU2cLK5JAoE1nOKK_0UGWGJ79ftkGr3vJyOf9NkBZ3dtt6dvTZPTCRQ_hK7hHSMQ7GIGKTvjr9KOLIqB14rV0PptziBxUfy5XxVoQ9F_4qaBmfARtvnGbYtLFnX8q4zdTsLf-hWM4bGxQ9lgfmhPYsfkp7FW5oLo4g2fq5ekWlCHK1EqphS2YzA1rOFZJk2yB2MqaZjQ4gOPljBnLpjOqDmJK6jnVi-TmHN7Ho7fhxG2aLriSmErRn8xsRLGYOJFEX2ZC7Qkde1JKzg1q0NhMKW5U4Jsw0jxMhFVxoFWiudJa-xdwICk5Py-rIj5zCSyRyqLKI4G-ZMADLoXvD1SkooFENIx8By5rTaTLmmIjJTdx4KOt5cBDu3ndIPcReFPat7TZqTQJojhdmsyB3oZ4Jbg2h4tK3IG7dVV34hVlT5iQa0cGsAN8G7Fhw6hOTALl1faruIZ9-qwTZG5gt_xa2duaEvIbN9TzTg}
}
Kolmogorov, A.N. Three approaches to the quantitative definition of information 1965 Problems Inform. Transmission
Vol. 1(1-4), pp. 1-7 
article DOI URL 
BibTeX:
@article{Kolmogorov1965,
  author = {Kolmogorov, A. N.},
  title = {Three approaches to the quantitative definition of information},
  journal = {Problems Inform. Transmission},
  year = {1965},
  volume = {1},
  number = {1-4},
  pages = {1-7},
  url = {http://dx.doi.org/10.1080/00207166808803030},
  doi = {http://doi.org/10.1080/00207166808803030}
}
Vitanyi, P. Three Approaches to the Quantitative Definition of Information in an Individual Pure Quantum State 2000 Proceedings of the 15th Annual IEEE Conference on Computational Complexity, pp. 263-  inproceedings URL 
BibTeX:
@inproceedings{vitanyi_three_2000,
  author = {Vitanyi, Paul},
  title = {Three Approaches to the Quantitative Definition of Information in an Individual Pure Quantum State},
  booktitle = {Proceedings of the 15th Annual IEEE Conference on Computational Complexity},
  publisher = {IEEE Computer Society},
  year = {2000},
  pages = {263--},
  url = {http://dl.acm.org/citation.cfm?id=792765.793427}
}
Müller, M.P. and Masanes, L. Three-dimensionality of space and the quantum bit: an information-theoretic approach 2013 New Journal of Physics
Vol. 15(5), pp. 053040 
article URL 
Abstract: It is sometimes pointed out as a curiosity that the state space of quantum two-level systems, i.e. the qubit, and actual physical space are both three-dimensional and Euclidean. In this paper, we suggest an information-theoretic analysis of this relationship, by proving a particular mathematical result: suppose that physics takes place in d spatial dimensions, and that some events happen probabilistically (not assuming quantum theory in any way). Furthermore, suppose there are systems that carry ‘minimal amounts of direction information’, interacting via some continuous reversible time evolution. We prove that this uniquely determines spatial dimension d = 3 and quantum theory on two qubits (including entanglement and unitary time evolution), and that it allows observers to infer local spatial geometry from probability measurements.
BibTeX:
@article{muller_three-dimensionality_2013,
  author = {Müller, Markus P. and Masanes, Lluís},
  title = {Three-dimensionality of space and the quantum bit: an information-theoretic approach},
  journal = {New Journal of Physics},
  year = {2013},
  volume = {15},
  number = {5},
  pages = {053040},
  url = {http://stacks.iop.org/1367-2630/15/i=5/a=053040}
}
Corda, C. Time dependent Schrödinger equation for black hole evaporation: No information loss 2015 Annals of Physics
Vol. 353, pp. 71-82 
article URL 
Abstract: In 1976 S. Hawking claimed that "Because part of the information about the state of the system is lost down the hole, the final situation is represented by a density matrix rather than a pure quantum state" (verbatim from Ref. [2]). This was the starting point of the popular "black hole (BH) information paradox". In a series of papers, together with collaborators, we naturally interpreted BH quasi-normal modes (QNMs) in terms of quantum levels, discussing a model of excited BH somewhat similar to the historical semi-classical Bohr model of the structure of the hydrogen atom. Here we explicitly write down, for the same model, a time dependent Schrödinger equation for the system composed of Hawking radiation and BH QNMs. The physical state and the corresponding wave function are written in terms of a unitary evolution matrix instead of a density matrix. Thus, the final state turns out to be a pure quantum state instead of a mixed one. Hence, Hawking's claim is falsified, because BHs turn out to be well defined quantum mechanical systems, having ordered, discrete quantum spectra, which respect 't Hooft's assumption that Schrödinger equations can be used universally for all dynamics in the universe. As a consequence, information comes out in BH evaporation in terms of pure states in a unitary time dependent evolution. In Section 4 of this paper we show that the present approach also permits solving the entanglement problem connected with the information paradox.
BibTeX:
@article{corda_time_2015,
  author = {Corda, Christian},
  title = {Time dependent Schrödinger equation for black hole evaporation: No information loss},
  journal = {Annals of Physics},
  year = {2015},
  volume = {353},
  pages = {71--82},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3JTsMwEB0hJCQkxA4Ni-QDJ6RUSWzHMTe2ihMXyjmKNyFUtYW2fBo_wI9hO3FoSw9wjGRZyYw1M_F78wYAZ90kXooJghpCpeBMe00okxOdS2qMVKwghJjFm-1WdnsZ0PfErGrkdCZT0nXqm15HkhDs2Fy9u5sWQEgJLcKwPGJLoABortphMSXN1ZmuMWQ2WZmSfPrp7UDgYwfaSYtF_3TL_6Zl_-WzdmG7KUnRdX2G9mBND_dhw1ND5eQA-q5LBIVhuVP0JF_evz6Vvw5E-q2WCke29kXCXQYiN3AX6Y9q3ByuK_Q4Qo0-q186sEY4hOfeff_2IW5mMcRVmmcsNhXjTEiGVSZwhhkWQmApEmwjnqLWk5nIGVEk0YwnGDOTp0oTygutCk1tpjyCrcpx9odT39unOoC0LJjihcJKayIpFibVWPOMGlZRbpIILoNfynGtvVEGUtprae1VOnvZf5jS2iuC4-C5Ug0GJfa9obzI0wg6tSPbTXJfqKWYRXAx79p2gQdYXTTMPNR78p8XOYVN-0RrZvcZrE_fZ_q8Fnn8Bspw5Mo}
}
Stampe, D. Toward a Causal Theory of Linguistic Representation 1977 Midwest Studies in Philosophy
Vol. 2(1), pp. 42-63 
article  
BibTeX:
@article{stampe_toward_1977,
  author = {Stampe, Dennis},
  title = {Toward a Causal Theory of Linguistic Representation},
  journal = {Midwest Studies in Philosophy},
  year = {1977},
  volume = {2},
  number = {1},
  pages = {42--63}
}
Caticha, A. Towards an Informational Pragmatic Realism 2014 Minds and Machines
Vol. 24(1), pp. 37-70 
article  
Abstract: I discuss the design of the method of entropic inference as a general framework for reasoning under conditions of uncertainty. The main contribution of this discussion is to emphasize the pragmatic elements in the derivation. More specifically: (1) Probability theory is designed as the uniquely natural tool for representing states of incomplete information. (2) An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. (3) The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting framework includes as special cases both MaxEnt and Bayes' rule. It therefore unifies entropic and Bayesian methods into a single general inference scheme. I find that similar pragmatic elements are an integral part of Putnam's internal realism, of Floridi's informational structural realism, and also of van Fraassen's empiricist structuralism. I conclude with the conjecture that their valuable insights can be incorporated into a single coherent doctrine: an informational pragmatic realism.
BibTeX:
@article{caticha_towards_2014,
  author = {Caticha, Ariel},
  title = {Towards an Informational Pragmatic Realism},
  journal = {Minds and Machines},
  year = {2014},
  volume = {24},
  number = {1},
  pages = {37--70}
}
Hartley, R. Transmission of Information 1928 The Bell System Technical Journal
Vol. 7(3), pp. 535-563 
School: Lucent, Bell Laboratories 
techreport  
Abstract: A quantitative measure of "information" is developed which is based on physical as contrasted with psychological considerations. How the rate of transmission of this information over a system is limited by the distortion resulting from storage of energy is discussed from the transient viewpoint. The relation between the transient and steady state viewpoints is reviewed. It is shown that when the storage of energy is used to restrict the steady state transmission to a limited range of frequencies the amount of information that can be transmitted is proportional to the product of the width of the frequency-range by the time it is available. Several illustrations of the application of this principle to practical systems are included. In the case of picture transmission and television the spacial variation of intensity is analyzed by a steady state method analogous to that commonly used for variations with time.
BibTeX:
@techreport{Hartley1928,
  author = {Hartley, R. V. L.},
  title = {Transmission of Information},
  journal = {The Bell System Technical Journal},
  school = {Lucent, Bell Laboratories},
  year = {1928},
  volume = {7},
  number = {3},
  pages = {535--563}
}
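Hartley's quantitative measure takes the information in a message of n symbol selections from an alphabet of s symbols to be H = n log s, so it grows linearly with message length (and, in his transmission analysis, with the bandwidth–time product). A small worked example; the concrete numbers are illustrative, not taken from the paper:

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int, base: float = 10.0) -> float:
    """Hartley's measure H = n * log_base(s): information in n selections from s symbols."""
    return n_symbols * math.log(alphabet_size, base)

# A 3-digit decimal number carries 3 hartleys (base-10 units)...
print(hartley_information(3, 10))            # 3.0
# ...or roughly 3 * log2(10) ~ 9.97 bits when measured in base 2.
print(hartley_information(3, 10, base=2.0))
```

Shannon's later measure reduces to Hartley's when all s^n messages are equally probable, which is why the base-10 unit is still called the hartley.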
Floridi, L. Trends in the Philosophy of Information 2008, pp. 113-131  incollection URL 
Armstrong, D. Truth and Truthmakers 2004   book  
BibTeX:
@book{Armstrong2004,
  author = {Armstrong, D.},
  title = {Truth and Truthmakers},
  publisher = {Cambridge University Press},
  year = {2004}
}
Floridi, L. Turing's three philosophical lessons and the philosophy of information 2012 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Vol. 370(1971), pp. 3536-3542 
article URL 
Abstract: In this article, I outline the three main philosophical lessons that we may learn from Turing's work, and how they lead to a new philosophy of information. After a brief introduction, I discuss his work on the method of levels of abstraction (LoA), and his insistence that questions could be meaningfully asked only by specifying the correct LoA. I then look at his second lesson, about the sort of philosophical questions that seem to be most pressing today. Finally, I focus on the third lesson, concerning the new philosophical anthropology that owes so much to Turing's work. I then show how the lessons are learned by the philosophy of information. In the conclusion, I draw a general synthesis of the points made, in view of the development of the philosophy of information itself as a continuation of Turing's work.
BibTeX:
@article{floridi_turings_2012,
  author = {Floridi, Luciano},
  title = {Turing's three philosophical lessons and the philosophy of information},
  journal = {Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences},
  year = {2012},
  volume = {370},
  number = {1971},
  pages = {3536--3542},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV3NT90wDLfQxiamie0xVgpMyoGJ7dBH89E2OU4PnpCAG0zcojRNT-gNaJHYfz8n_eDxNmncEsU9pHZsJ7Z_BuBsmiYrOqGoXSFsYTNBlZOyylzB0HkuraEG7yjl85ft8fljNaCv5JGv9-iRNznLQv107ls2_KQnY_SA5qHTDuW5SLKUX49gjaufPzNGowVa9wUhD80_TVEwO_MPMBTLDOkmYwz6qUr-73Ts_27nI2z2fij50QnOBNbcYgveLaET4uxihHRttuBNyBX1o0mvDxryrQet_v4J5peh3vGwIS1KhyO3Q4sELwTkBvUpijcxiwrXl1Z_k1816cFbvYhsw9X85HJ2mvQ9GhJLURUkkjmZKuO8K0NrEQDpKyG5RdtYlrVidS0KXsvcqFRUTGSWubKimeXoqFib8s_w3vhc_kUbav6qCF7XePBc5I1hhP87grfX6vxYnp7NuulkmE6bUJg2vWsj5Hw4t0k-LXaACOWYwZumE4ahhlKSGo52VwSoIa5cDIcD3_Vth-qhu2i81J4j2nNEe47EEHViMdIxVvgW7SyGr52cPK3ohulUSyFDD2ehhG4f2xh2Vug4_irfzlHGcLAsYSNBuKjmoXOY92ljoC8hm_Wg7h7MoN198Q73YAPHIQWZyX141d4_uC8dKOUfu5cV-w}
}
Floridi, L., Taddeo, M. and Turilli, M. Turing’s Imitation Game: Still an Impossible Challenge for All Machines and Some Judges––An Evaluation of the 2008 Loebner Contest 2009
Vol. 19(1), pp. 145-150 
article URL 
Abstract: An evaluation of the 2008 Loebner contest
BibTeX:
@article{floridi_turings_2009,
  author = {Floridi, Luciano and Taddeo, Mariarosaria and Turilli, Matteo},
  title = {Turing’s Imitation Game: Still an Impossible Challenge for All Machines and Some Judges––An Evaluation of the 2008 Loebner Contest},
  year = {2009},
  volume = {19},
  number = {1},
  pages = {145--150},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV1Lb9NAEB5VIFVIqKU8TIBKe-GCsOXHZp3lFoWWtpQLiRC3aL0PCZE4oXHvPXLnxN_rL2Fm7U0U2kMr7cHWji1bs_PYnZlvAIo8SeP_dEIlXSp16fpcazNQvO9EJYVRRekM7gL49sk25OuTjPpnEgKUXm9vSt8IcyCm4L1ENRwT6DaaKlrjX8ff1nEEGh5tL-exwL1AiGve9oZty7QdFvXW5ngfQhVMyDJZh543xfE3s7Dv8RdPYK9zRNmwXTkHsGPrp7AfmjywTuafwe-JL2S8vvq7YqfzDtGbfVJz-4GNmx-zGVM1TiwXJFwzy0ahOwtDd5gNcf6Lz9e0KyQ0bLyYW3Z2SdAS11d_cAxrdrRGHGcLx9AjZZQuwc4XtqrxUzyA1qp5DpPjo8noJO7aN8S6lGlsJG1_MzEoc4O7pCwzqEu00JkVymrJtZWFUbq01uWlKrniJs80-RQF-qicFy_gsaIs_7rx1YAmgocORdJGZCYjZEkEu9_l-cfByedRe3sQbpOVL1lLfjURrgQv0bFIypfAXKZMhRZY9B1Bsxk1SF2lndAmVZVxpgfvwiKYLlu8j-kG2Zm4NaW-ncStqejB-8DWDbGn8c09Ow62xEvjevD2BjkFp7pnMrzwtK_uSPcaHrWxLUqueQMPmotLe9jCSf4DVcQDRA}
}
Floridi, L. Two Approaches to the Philosophy of Information 2003 Minds and Machines
Vol. 13(4), pp. 459-469 
article URL 
BibTeX:
@article{floridi_two_2003,
  author = {Floridi, Luciano},
  title = {Two Approaches to the Philosophy of Information},
  journal = {Minds and Machines},
  year = {2003},
  volume = {13},
  number = {4},
  pages = {459--469},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwjV09T8MwED2hTiyUT1E-JI8saV3bceKxQlSMCCrWKIl9C1JS0SLEv-ec2KW0S6WMPslxHN_z3bt3AFKMebJzJkjvuDG12hhErIzjZCucrcmb8RLl_8g28E0ko_kYxwRld25vS_MI7ZNCUvCucJ0claf0vb69b7II_um09oRKNN0EdqR9tuz38qCde5kPIXKMI61kk2v-q4bfp10fPO1TOAm4k836jXIGR645h2Hs6cDCL34Bk8V3y2ZBaNyt2LplBBHZS-x48MNaZKGGyX_TS1jMnxaPz0loqpCsPMnJER7gqtTaofP-MTWEgbSfTp07zFG6rK4xk7aqxbSsdJWm1mCV55hZgoqlvIJB0zbuGpjCfMqdzbRVTmWmKgkrWJ1rXgtUBDxGMIlrUCx76Qy6b9Aq-NaXqggvXBBiEzwzxdLiCB72LDx5LphNZaHi8JvDh97Ccc_A83GTOxisP7_cfa-6-AuQfr5_}
}
Mitrokhin, Y. Two faces of entropy and information in biological systems 2014 Journal of theoretical biology
Vol. 359, pp. 192-198 
article URL 
Abstract: The article attempts to overcome the well-known paradox of contradictions between the emerging biological organization and entropy production in biological systems. It is assumed that quality, speculative correlation between entropy and antientropy processes taking place both in the past and today in the metabolic and genetic cellular systems may be perfectly authorized for adequate description of the evolution of biological organization. So far as thermodynamic entropy itself cannot compensate for the high degree of organization which exists in the cell, we discuss the mode of conjunction of positive entropy events (mutations) in the genetic systems of the past generations and the formation of organized structures of current cells. We argue that only the information which is generated in the conditions of the information entropy production (mutations and other genome reorganization) in genetic systems of the past generations provides the physical conjunction of entropy and antientropy processes separated from each other in time generations. It is readily apparent from the requirements of the Second law of thermodynamics.
BibTeX:
@article{mitrokhin_two_2014,
  author = {Mitrokhin, Y.},
  title = {Two faces of entropy and information in biological systems},
  journal = {Journal of theoretical biology},
  year = {2014},
  volume = {359},
  pages = {192--198},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3JTsMwEB0hBBISYoeGRfKBGwQl2I0dblCo-IByjhIvhx7aqk2F-vfMxElKoUjllsiOJec5nhnnzRsA_vgQhT_2BKvR8JP2eN6VtD8qa0RUOJ4Y5yTXdvVkG-7_-KFfEbOGGNcTJ0tUwpsxJfoKwYnO1X99WQruiqo8YMVWJy-lTphZP8SKUaq35tYg7VB-yHy21jJVVqh_CA0Jv2GftL-kl0nzv9nZG83uCA5q15Q9-7V0DFt2dAK7vljl4hSeBp9j5ojBxcaO0aHweLJg-ciwWnyVIMZr5nWdCHzmdaJnZ_DRfxv03sO68kKoMWJ6DF3snC4w0su1UKaLUWQkrTNo8LoOHTjrtJGpo8rrQqEDKSV6LQ4dHVloh7Cm_Bz2c2Loj8oqk890gKU8tTJVRuXcCp1GKX6eujCqKDBkTJQM4K7BIJt4pY2soaANM3otGb2WjHh4sQrggmDKaIblNNcZV-g8YiyWYItHrh2FymsnnEcBdDyUbQuXXJACID50-x3ctkOl34NRM8Z5tJICiDfp1qvl1UlWoLz817yuYI_uPF3wGrbL6dzeeInILw4-9V8}
}
Ferguson, T.M. Two paradoxes of semantic information 2015 Synthese
Vol. 192(11), pp. 3719-3730 
article  
Abstract: Issue Title: Special Issue on Ontology & Methodology, guest edited by Benjamin C. Jantzen, Deborah G. Mayo, and Lydia Patton. Yehoshua Bar-Hillel and Rudolph Carnap's classical theory of semantic information entails the counterintuitive feature that inconsistent statements convey maximal information. Theories preserving Bar-Hillel and Carnap's modal intuitions while imposing a veridicality requirement on which statements convey information–such as the theories of Fred Dretske or Luciano Floridi–avoid this commitment, as inconsistent statements are deemed not information-conveying by fiat. This paper produces a pair of paradoxical statements that such "veridical-modal" theories must evaluate as both conveying and not conveying information, although Bar-Hillel and Carnap's theory accommodates these statements without inconsistency. Moreover, the paradoxes are independently interesting as the mode in which they self-refer bears on their evaluation.
BibTeX:
@article{ferguson_two_2015,
  author = {Ferguson, Thomas M.},
  title = {Two paradoxes of semantic information},
  journal = {Synthese},
  year = {2015},
  volume = {192},
  number = {11},
  pages = {3719--3730}
}
Rocchi, P. Ubiquity symposium: What is information?: beyond the jungle of information theories 2011 Ubiquity
Vol. 2011(March), pp. 1-9 
article  
Abstract: This fourteenth piece is inspired by a question left over from the Ubiquity Symposium entitled What is Computation? Computing saw the light as a branch of mathematics in the forties, and progressively revealed ever new aspects. Nowadays computer science exhibits many faces [gol97]. Even laymen have become aware of the broad assortment of functions achieved by systems and the prismatic nature of computing challenges [mul98]. The Ubiquity Symposium "What is Computation?" is the most recent collection of commentaries from multiple viewpoints about computing.
BibTeX:
@article{rocchi_ubiquity_2011,
  author = {Rocchi, Paolo},
  title = {Ubiquity symposium: What is information?: beyond the jungle of information theories},
  journal = {Ubiquity},
  year = {2011},
  volume = {2011},
  number = {March},
  pages = {1--9}
}
Kohlas, J. and Eichenberger, C. Uncertain information 2009
Vol. 5363, pp. 128-160 
inproceedings URL 
BibTeX:
@inproceedings{kohlas_uncertain_2009,
  author = {Kohlas, Jürg and Eichenberger, Christian},
  title = {Uncertain information},
  year = {2009},
  volume = {5363},
  pages = {128--160},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwzV1bT8IwFG6MJkZjoiAOvERe9IWMzHbs8uCDgoZENCZC4tvCRgkkuCHbEn–PW3XzQHvPu6WlnL6na-n53xFiOC2oZcwAVNnzHy17xBjQmnHCIzAtI2x6085KyelyHZ2kld-75__8SVGvNH5DOSmwFuU8FyrVnZug5rGjFPOw2CR8nKWmAEHaB3HrbUPH1Y8j0iocRSEOyHUvvby4zySGqwJTz1yi7Y4Yp_xzIOWeqew-x_NFrKqLJVqW8DwIUWVJ5_JSsWZRKM_YQq3EKbIVquErXU45bEKIEcYIrM1jQA5KkDYAmlFIqRMM5TuEAmEAmjvZEm58Nl34kyCNXdQzABhjevQuqsTb4P4dskpqlRFwpZ0rtWx3VvQYv-azIPknob66IO5fZAuAv3SXl-F9RhG2jbJyQDoM4qNLPEzobwoGwYsBKDyYVGqWEL4uNRnELMNomUaFwjQ8ATV8tLQ5ruyuwraoWEVHWc21pQ2VkWHr0oPOD5FNWUBzYIF1ND189Ow29dFg95SKJ94ajDIGToaQwlGmPBSzYmG9qZs2lANOIzGOqeh_U930HP6L11xWcku2zGvJ2x_JxobJT7rdKtt11HTpy5j5NgMcGCbHSfwJ3hsWpZj-A52prbVQDelDmEvxh4kJ1qwHQ9820t-kgaqb-v4-fZHF-ggN9xLtJusUnol1Dl_AYQAdwA}
}
Kohlas, J. Uncertain information: Random variables in graded semilattices 2007 International Journal of Approximate Reasoning
Vol. 46(1), pp. 17-34 
article  
Abstract: Random sets can be considered as random variables with values in a Boolean algebra, in particular in a field of sets. Many properties of random sets, in particular those relative to belief functions, can also be obtained by relaxing the algebraic structure of the domain. In fact, functions monotone to order [infinity], like Choquet capacities, can be defined on semilattices. In this paper random variables with values in some kind of graded semilattices are studied. It is shown that this algebraic structure models important operations regarding information. It turns out that random variables in this algebra form themselves an algebra of the same kind. Their probability distribution corresponds to functions monotone of order [infinity] or to belief functions in the sense of Dempster-Shafer theory of evidence. This paper proposes therefore a natural generalization of evidence theory to a general structure related to probabilistic argumentation systems. Those systems have many interesting models like probabilistic assumption-based reasoning with different kind of logics, systems of linear equations and inequalities with stochastic disturbances, etc. The theory presented leads thus to an interesting and novel way of combining logic and probability.
BibTeX:
@article{kohlas_uncertain_2007,
  author = {Kohlas, Jürg},
  title = {Uncertain information: Random variables in graded semilattices},
  journal = {International Journal of Approximate Reasoning},
  year = {2007},
  volume = {46},
  number = {1},
  pages = {17--34}
}
Rastegin, A.E. Uncertainty and certainty relations for complementary qubit observables in terms of Tsallis’ entropies 2013 Quantum Information Processing
Vol. 12(9), pp. 2947-2963 
article URL 
Abstract: Uncertainty relations for more than two observables have found use in quantum information, though commonly known relations pertain to a pair of observables. We present novel uncertainty and certainty relations of state-independent form for the three Pauli observables with use of the Tsallis α-entropies. For all real α ∈ (0; 1] and integer α ≥ 2, lower bounds on the sum of three α-entropies are obtained. These bounds are tight in the sense that they are always reached with certain pure states. The necessary and sufficient condition for equality is that the qubit state is an eigenstate of one of the Pauli observables. Using concavity with respect to the parameter α, we derive approximate lower bounds for non-integer α ∈ (1; +∞). In the case of pure states, the developed method also allows to obtain upper bounds on the entropic sum for real α ∈ (0; 1] and integer α ≥ 2. For applied purposes, entropic bounds are often used with averaging over the individual entropies. Combining the obtained bounds leads to a band, in which the rescaled average α-entropy ranges in the pure-state case. A width of this band is essentially dependent on α. It can be interpreted as an evidence for sensitivity in quantifying the complementarity.
BibTeX:
@article{rastegin_uncertainty_2013,
  author = {Rastegin, Alexey E.},
  title = {Uncertainty and certainty relations for complementary qubit observables in terms of Tsallis’ entropies},
  journal = {Quantum Information Processing},
  year = {2013},
  volume = {12},
  number = {9},
  pages = {2947--2963},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV07T8MwED4hJCQWCuVVHpIHNpQqsZ1HR4SomKFltWzHERVVUpp26Mbf4O-xSzg7SaNSBhhjXSLlbN99d-fZB8Bo3-d-2ATpqwH3leEyNVL6kcZlhs4o1ZppRAThZmYb6DqTkb-1mwKls9vt0TcECpaGZdlnUeKt0Aqjq7Kkvqfnl3XIFUSu12MQ2vYqcRg2dc3fvtAa5c2SqPM0ww40J2Aahsm67NwejN9mYP-jDw7hoAah5K5aNUewY-IudJoGD6Te713Yc-xQXR7D6xgHHHlgsSIyT0n7NG-odATxL3EU9YqRPl-R96WaLEihXOpXTU1JJjmxzqAkRUZGpZxOJ-XXxyexOeZihlH7CYyHD6P7R69u0uBJe3ubx3nsKx0zBAYmCzXCycBgTMOYjLTPM5NkWlNfB4FiCbcLgMeJSVUQJ1JKg6OnsJsXuTkHorLIp4bxSOMXUERRSVMZy1SxjJuU9uDMTpCwW28xl1pgeM0HiLF4D24bHYtZdUuHcOoViG-YqNUpUM1ilmY9uNmSdnLVKwEVAyt68TexS9inbhIt7-wKdhfzpbmuLnX8Bq5j4tg}
}
Klir, G.J. Uncertainty and information: foundations of generalized information theory 2006   book  
BibTeX:
@book{klir_uncertainty_2006,
  author = {Klir, George J.},
  title = {Uncertainty and information: foundations of generalized information theory},
  publisher = {Wiley-Interscience},
  year = {2006}
}
Orlitsky, A., Santhanam, N.P. and Zhang, J. Universal compression of memoryless sources over unknown alphabets 2004 IEEE Transactions on Information Theory
Vol. 50(7), pp. 1469-1481 
article  
Abstract: It has long been known that the compression redundancy of independent and identically distributed (i.i.d.) strings increases to infinity as the alphabet size grows. It is also apparent that any string can be described by separately conveying its symbols, and its pattern–the order in which the symbols appear. Concentrating on the latter, we show that the patterns of i.i.d. strings over all, including infinite and even unknown, alphabets, can be compressed with diminishing redundancy, both in block and sequentially, and that the compression can be performed in linear time. To establish these results, we show that the number of patterns is the Bell number, that the number of patterns with a given number of symbols is the Stirling number of the second kind, and that the redundancy of patterns can be bounded using results of Hardy and Ramanujan on the number of integer partitions. The results also imply an asymptotically optimal solution for the Good-Turing probability-estimation problem.
BibTeX:
@article{orlitsky_universal_2004,
  author = {Orlitsky, A. and Santhanam, N. P. and Zhang, Junan},
  title = {Universal compression of memoryless sources over unknown alphabets},
  journal = {IEEE Transactions on Information Theory},
  year = {2004},
  volume = {50},
  number = {7},
  pages = {1469--1481}
}
Davies, P. Universe from bit 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 65-91  incollection  
BibTeX:
@incollection{davies_universe_2010,
  author = {Davies, Paul},
  title = {Universe from bit},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {65--91},
  note = {DOI: 10.1017/CBO9780511778759.004}
}
Clayton, P. Unsolved dilemmas: the concept of matter in the history of philosophy and in contemporary physics 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 38-62  incollection  
BibTeX:
@incollection{clayton_unsolved_2010,
  author = {Clayton, Philip},
  title = {Unsolved dilemmas: the concept of matter in the history of philosophy and in contemporary physics},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {38--62},
  note = {DOI: 10.1017/CBO9780511778759.003}
}
Liu, K., Yin, X., Fan, X. and He, Q. Virtual assembly with physical information: a review 2015 Assembly Automation
Vol. 35(3), pp. 206-220 
article URL 
Abstract: Purpose - The purpose of this paper is to give a comprehensive survey on the physics-based virtual assembly (PBVA) technology in a novel perspective, to analyze current drawbacks and propose several promising future directions. Design/methodology/approach - To provide a deep insight of PBVA, a discussion of the developing context of PBVA and a comparison against constraint-based virtual assembly (CBVA) is put forward. The core elements and general structure are analyzed based on typical PBVA systems. Some common key issues as well as common drawbacks are discussed, based on which the research trend and several promising future directions are proposed. Findings - Special attention is paid to new research progresses and new ideas concerning recent development as well as new typical systems of the technology. Advantages of PBVA over CBVA are investigated. Based on the analysis of typical PBVA systems and the evolution of PBVA, the core elements of the technology and the general structure of its implementation are identified. Then, current PBVA systems are summarized and classified. After that, key issues in the technology and current drawbacks are explored in detail. Finally, promising future directions are given, including both the further perfecting of the technology and the combination with other technologies. Originality/value - The PBVA technology is put into a detailed review and analysis in a novel way, providing a better insight of both the theory and the implementation of the technology.
BibTeX:
@article{liu_virtual_2015,
  author = {Liu, Keyan and Yin, Xuyue and Fan, Xiumin and He, Qichang},
  title = {Virtual assembly with physical information: a review},
  journal = {Assembly Automation},
  year = {2015},
  volume = {35},
  number = {3},
  pages = {206--220},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV07T8MwED4hJhDi_SgPyRNbII4fSdgqRMXAWLFGTmxLlUpVlTLw77lznLRQBtiyJFJ8yX13vu_7DCCyuzT5kRMUorI1QlM-tNzXFpGfCDauLHNjg6fr2s52rxELJMt2c2Yye6duNSTvyF6_Hw5pco8QJhNEQszDXPIiCH5f-kkCdTQtjVEmiN0yGjdu3r4hyV2l5YA1owPoSMEdx6QfPK-k8b9wsP_zDoewHwtRNmy_nCPYcrNj2F2zJzwB-TpZkLyEYYHt3urpJ6NdWzaPoWXRdJVC-8AMa1UwpzAePY0fn5N4ykLiNMf-UcjU2EZr5XTJ60zoGsFM-oxOSkiNy7HlcN7l1jcmw4B7bTHuuZYOC0PllTiDPUNk_NkyiPbsBTCvndJlWasCm8emsbVutKSf3VBFZ80AbuOKVPPWU6MKvUhaVN-XYwDnXSQqO51WAusbmgVyfvnXR1zBDl6qwNkT17C9XHy4m9Zz8QsURr0H}
}
Mathur, S.D. What Exactly is the Information Paradox? 2009
Vol. 769, pp. 3-48 
incollection URL 
Abstract: The black hole information paradox tells us something important about the way quantum mechanics and gravity fit together. In these lectures I try to give a pedagogical review of the essential physics leading to the paradox, using mostly pictures. Hawking’s argument is recast as a ‘theorem’: if quantum gravity effects are confined to within a given length scale and the vacuum is assumed to be unique, then there will be information loss. We conclude with a brief summary of how quantum effects in string theory violate the first condition and make the interior of the hole a ‘fuzzball’.
BibTeX:
@incollection{mathur_what_2009,
  author = {Mathur, S. D.},
  title = {What Exactly is the Information Paradox?},
  publisher = {Springer Berlin Heidelberg},
  year = {2009},
  volume = {769},
  pages = {3--48},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwlV3fS8MwED5EGIiCOpXNH5AXxZdq2qRN-iSiGz74MEGfS9KlIOg21g62_967du1w28veUpoGeqG5u-_u-wogggfurZ0JRnCrlcwkOjgTOt-3hMpH2lqB_seq_8h2wxEre9_rAmV5btfUt1VVX6MHxQwooRQIY39q63p__WhwFmLpSKGWtA6cHMZxq0E8ymveqBFtXZdERNPxZJZvFEtLH9Q_hrpZuO49aQrSK8r8Zm_2Tu92AofEfWBESkB7n8KeG7Wh85wTXj7-XbA7Vo4rOCRvQ2tQjc7gnjTAWW9u0uJnwb5zhoElW3KdaO_ZwEzNcDx_Ooevfu_z5c1b_oXBMz7GC14qSGZKxoHF7xX3DxM0x7MoDqIs0mEqlHQ8zlSsXKSFdXKIIYALrXGpM5oPQ3EBR4a69UdFyeobdoA5FQc8jaXRLpMZrhlmhnLPNLQ6VUp04bYyeTKpNDeSIMmDhCehwozGjzAnUkkxL7rQWZuH8aQiibaoC4-1cZubWwyrScEwIbP6lzs_cQUHVXGJEJlr2C-mM3dT6Tn-AQ_n05k}
}
Millikan, R.G. What has natural information to do with intentional representation? 2001 Philosophy (supplement 49), pp. 105  article URL 
Abstract: Millikan discusses Fred Dretske's "Knowledge and the Flow of Information" and argues that what an animal needs to know about its environment is not available as natural information of the intentional representation kind. She proposes a softer view of natural information that is at least hinted at by Dretske and shows that it does not have verificationist consequences.
BibTeX:
@article{millikan_what_2001,
  author = {Millikan, Ruth G.},
  title = {What has natural information to do with intentional representation?},
  journal = {Philosophy},
  year = {2001},
  number = {supplement 49},
  pages = {105},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMwnV07T8MwED4BlVAlBBTaUB6S_0BCHTtOMlVQqCrBwNChW-T4AUPV0jYM_Ht8eRQB6oKUxVGG6M6-l7_7DoCFwcD_ZROMTlXKcyZD5XyiVYbplFnKqOV4Jyl-VrahYYWotd0YydJy66XCovmtC0ywy3Mghu8rH6dI4W1rPVJjH1oUu1Ld_o5n91vL7Lx1xaHJqI-Jyi4zXPqW8Qk0YKcGU7K9aP5uhf-Luf7vP5_CcR2Dkrtq03RgzyzOoP3SDDX4PIcRMnqTN7khJfGn-7gmWEU1kmJJ9JJgBZcg20SNWSclP2bTy7QYdmE6fpyOJn49bsF_5Uz4ERL_RFRS7VyUpDZ1Jzu1oYvnEi6FVZqKRMnEeTct88hozQ0Xwkoe61DnQrEeHElE5S-KsntPe9Cy7ggZD92a54TqweEsfX5IJk-jatlplsGmbDELVoXndFWKwxdBfAGERyyPDYtykSQ8l1GShlanuQu2raEi5H3ooQozlEKxlipDwrIojuM-dBsVZHo-zzCvc7mmuNzx_graFbwMn2s4KNYf5qbiZvwCErvRtA}
}
Calude, C.S., Chaitin, G.J., Fredkin, E., Leggett, A.J., de Ruyter, R., Toffoli, T. and Wolfram, S. What is computation? (How) does nature compute? 2012   incollection  
BibTeX:
@incollection{calude_what_2012,
  author = {Calude, Cristian S. and Chaitin, Gregory J. and Fredkin, Edward and Leggett, Anthony J. and de Ruyter, Rob and Toffoli, Tommaso and Wolfram, Stephen},
  title = {What is computation? (How) does nature compute?},
  year = {2012}
}
Adami, C. What is information? 2016 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Vol. 374(2063) 
article  
Abstract: Information is a precise concept that can be defined mathematically, but its relationship to what we call 'knowledge' is not always made clear. Furthermore, the concepts 'entropy' and 'information', while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
BibTeX:
@article{adami_what_2016,
  author = {Adami, C.},
  title = {What is information?},
  journal = {PHILOSOPHICAL TRANSACTIONS OF THE ROYAL SOCIETY A-MATHEMATICAL PHYSICAL AND ENGINEERING SCIENCES},
  year = {2016},
  volume = {374},
  number = {2063}
}
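The distinction Adami draws between entropy (uncertainty) and information (reduction of uncertainty through prediction) can be made concrete with a small sketch. The toy joint distribution below is purely illustrative, not taken from the paper; it just shows that information about X gained from Y is the drop from H(X) to H(X|Y), computed here as I(X;Y) = H(X) + H(Y) − H(X,Y).

```python
from math import log2

def entropy(p):
    """Shannon entropy, in bits, of a discrete distribution p."""
    return -sum(x * log2(x) for x in p if x > 0)

# Illustrative joint distribution of two binary variables X, Y
# (rows index x, columns index y), chosen so Y partly predicts X.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]        # marginal P(X)
py = [sum(col) for col in zip(*joint)]  # marginal P(Y)

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y):
# how much knowing Y reduces our uncertainty about X.
h_joint = entropy([p for row in joint for p in row])
mi = entropy(px) + entropy(py) - h_joint
```

With this distribution H(X) = 1 bit, but observing Y leaves only about 0.72 bits of uncertainty, so Y carries roughly 0.28 bits of information about X; entropy and information coincide only when prediction is perfect.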
Barbieri, M. What is Information? 2012 Biosemiotics
Vol. 5(2), pp. 147-152 
article  
BibTeX:
@article{barbieri_what_2012,
  author = {Barbieri, Marcello},
  title = {What is Information?},
  journal = {Biosemiotics},
  year = {2012},
  volume = {5},
  number = {2},
  pages = {147--152}
}
Lombardi, O. What is Information? 2004 Foundations of Science
Vol. 9(2), pp. 105-134 
article  
Abstract: The main aim of this work is to contribute to the elucidation of the concept of information by comparing three different views about this matter: the view of Fred Dretske's semantic theory of information, the perspective adopted by Peter Kosso in his interaction-information account of scientific observation, and the syntactic approach of Thomas Cover and Joy Thomas. We will see that these views involve very different concepts of information, each one useful in its own field of application. This comparison will allow us to argue in favor of a terminological 'cleansing': it is necessary to make a terminological distinction among the different concepts of information, in order to avoid conceptual confusions when the word 'information' is used to elucidate related concepts as knowledge, observation or entropy.
BibTeX:
@article{lombardi_what_2004,
  author = {Lombardi, Olimpia},
  title = {What is Information?},
  journal = {Foundations of Science},
  year = {2004},
  volume = {9},
  number = {2},
  pages = {105--134}
}
Rowley, J. What is information? 1998 Information Services & Use
Vol. 18(4), pp. 243 
article  
BibTeX:
@article{rowley_what_1998,
  author = {Rowley, Jennifer},
  title = {What is information?},
  journal = {Information Services & Use},
  year = {1998},
  volume = {18},
  number = {4},
  pages = {243}
}
Timpson, C.G. What is Information? 2013   incollection  
Abstract: Distinctions are drawn between a number of different information concepts. It is noted that ‘information’ in both the everyday and Shannon-theory setting is an abstract noun, though derived in different ways. A general definition of the concept(s) of information in the Shannon mould is provided and it is shown that a concept of both bits (how much) and pieces (what) of Shannon information is available. It is emphasised that the Shannon information, as a measure of information, should not be understood as an uncertainty; neither is the notion of correlation key to the Shannon concept. Corollaries regarding the ontological status of information and on the notion of information’s flow are drawn. The chapter closes with a brief discussion of Dretske’s attempt to base a semantic notion of information on Shannon’s theory. It is argued that the attempt is not successful.
BibTeX:
@incollection{timpson_what_2013,
  author = {Timpson, Christopher G.},
  title = {What is Information?},
  publisher = {Oxford University Press},
  year = {2013}
}
Harms, W.F. What Is Information? Three Concepts 2006 Biological Theory
Vol. 1(3), pp. 230-242 
article  
Abstract: The concept of information tempts us as a theoretical primitive, partly because of the respectability lent to it by highly successful applications of Shannon’s information theory, partly because of its broad range of applicability in various domains, partly because of its neutrality with respect to what basic sorts of things there are. This versatility, however, is the very reason why information cannot be the theoretical primitive we might like it to be. “Information,” as it is variously used, is systematically ambiguous between whether it involves continuous or discrete quantities, causal or noncausal relationships, and intrinsic or relational properties. Many uses can be firmly grounded in existing theory, however. Continuous quantities of information involving probabilities can be related to information theory proper. Information defined relative to systems of rules or conventions can be understood relative to the theory of meaning (semantics). A number of causal notions may possibly be located relative to standard notions in physics. Precise specification of these distinct properties involved in the common notion of information can allow us to map the relationships between them. Consequently, while information is not in itself the kind of single thing that can play a significant unifying role, analyzing its ambiguities may facilitate headway toward that goal.
BibTeX:
@article{harms_what_2006,
  author = {Harms, William F.},
  title = {What Is Information? Three Concepts},
  journal = {Biological Theory},
  year = {2006},
  volume = {1},
  number = {3},
  pages = {230--242}
}
Deacon, T.W. What is missing from theories of information? 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 146-169  incollection  
BibTeX:
@incollection{deacon_what_2010,
  author = {Deacon, Terrence W.},
  title = {What is missing from theories of information?},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {146--169},
  note = {DOI: 10.1017/CBO9780511778759.008}
}
Lombardi, O., Holik, F. and Vanni, L. What is Shannon information? 2016 Synthese
Vol. 193(7), pp. 1983-2012 
article  
Abstract: Despite of its formal precision and its great many applications, Shannon's theory still offers an active terrain of debate when the interpretation of its main concepts is the task at issue. In this article we try to analyze certain points that still remain obscure or matter of discussion, and whose elucidation contribute to the assessment of the different interpretative proposals about the concept of information. In particular, we argue for a pluralist position, according to which the different views about information are no longer rival, but different interpretations of a single formal concept.
BibTeX:
@article{lombardi_what_2016,
  author = {Lombardi, Olimpia and Holik, Federico and Vanni, Leonardo},
  title = {What is Shannon information?},
  journal = {Synthese},
  year = {2016},
  volume = {193},
  number = {7},
  pages = {1983--2012}
}
Hamame, C.M., Cosmelli, D. and Aboitiz, F. What is so informative about information? 2007 Behavioral and Brain Sciences
Vol. 30(4), pp. 371-372 
article  
Abstract: Understanding evolution beyond a gene-centered vision is a fertile ground for new questions and approaches. However, in this systemic perspective, we take issue with the necessity of the concept of information. Through the example of brain and language evolution, we propose the autonomous systems theory as a more biologically relevant framework for the evolutionary perspective offered by Jablonka & Lamb (J&L).
BibTeX:
@article{hamame_what_2007,
  author = {Hamame, Carlos M. and Cosmelli, Diego and Aboitiz, Francisco},
  title = {What is so informative about information?},
  journal = {Behavioral and Brain Sciences},
  year = {2007},
  volume = {30},
  number = {4},
  pages = {371--372}
}
Welker, M. What is the ‘spiritual body’? On what may be regarded as ‘ultimate’ in the interrelation between God, matter, and information 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 349-364  incollection  
BibTeX:
@incollection{welker_what_2010,
  author = {Welker, Michael},
  title = {What is the ‘spiritual body’? On what may be regarded as ‘ultimate’ in the interrelation between God, matter, and information},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {349--364},
  note = {DOI: 10.1017/CBO9780511778759.016}
}
Shea, N. What's transmitted? Inherited information 2011 Biology and Philosophy
Vol. 26(2), pp. 183-189 
article URL 
Abstract: Commentary on Bergstrom and Rosvall, 'The transmission sense of information', Biology and Philosophy. In response to worries that uses of the concept of information in biology are metaphorical or insubstantial, Bergstrom and Rosvall have identified a sense in which DNA transmits information down the generations. Their 'transmission view of information' is founded on a claim about DNA's teleofunction. Bergstrom and Rosvall see their transmission view of information as a rival to semantic accounts. This commentary argues that it is complementary. The idea that DNA is transmitting information down the generations only makes sense if it is carrying a message, that is to say if it has semantic content.
Keywords: Genetic information; Genetic representation; Inheritance systems; Information transmission; Natural selection; Evolution; Entropy
BibTeX:
@article{shea_whats_2011,
  author = {Shea, Nicholas},
  title = {What's transmitted? Inherited information},
  journal = {Biology and Philosophy},
  year = {2011},
  volume = {26},
  number = {2},
  pages = {183--189},
  url = {http://usyd.summon.serialssolutions.com/2.0.0/link/0/eLvHCXMw3V1bb9MwFLa4CDQJITYglJvywEU8pHLsxHUeEEKFqROTVtEN8WY1toMmbWVbPYn9e86x4zTteNgzj0msKPHnc7XPdwjhbEizDZ1gkPeJGVnVqDORFIvm0jCmmeGN8MRM_cx27IS3uvc_AI9k3D4L79AMnR47zGfyXdAEWOqHDmZLltpBEvd09_ymkl8M08ne_sHsYDrpPF1sfb3aujGrxCfv5wni6Yp-LlFUGZehG0ZUhqF8vQWd9TRbHvrNtEYyD31_rulfGuuRS47HsGgG_iNry3jWuK43bFB3MhA1dEVBy3C0pPwdEqCfmmPtPtpFdjS7DdG1xBj7–xHZ3DRpQsU7uF_4uZ1qJBc_5Lrlhc5YvXvs8vlP10P72YcPiIP2_gg_Rxw3Sa37GKH3AsdQ692yNY0tp64ekw-INTvl2kP6E9pB3Pag_kJOdr9ejieZG3ri-wXNiLJdA1hOhgAJGgsjK1yK7mmICmi0VLUFZ0XueGF1CBCBYTQBbNzPm8sFxb8ZVvyp-TBHEskFs6XUpqE3G1gWdsEfYwEfish939W-1_k5Ns4XG7Hy-HS1_sNz10CM-ylIhPD0TOSlppyI0vsTceKUtt6BM5-XVtZi7JhVAzIa5xcFUp4OxFSHAwH9rWhA5L4ATgBMDd67UkERJmTE8U4IMkhuMgH5G3AR50F_hXF1JIpqpDKENxniGEK5f44eMPGOIglaDES8Flv-sB2zzfW2oDkNxk2brnzkTPCPb_Zq1-QrZVIviR33MWlfRWIP_8C1iqLtA}
}
Einstein, A. Zur Elektrodynamik bewegter Körper. (German) [On the electrodynamics of moving bodies] 1905 Annalen der Physik
Vol. 322(10), pp. 891-921 
article DOI  
BibTeX:
@article{einstein,
  author = {Albert Einstein},
  title = {Zur Elektrodynamik bewegter Körper. (German) [On the electrodynamics of moving bodies]},
  journal = {Annalen der Physik},
  year = {1905},
  volume = {322},
  number = {10},
  pages = {891--921},
  doi = {10.1002/andp.19053221004}
}
Created by JabRef on 10/07/2017.