Bib: Information, SR, NOSR, QFT, Physics Pt 1.

Bawden, D. and Robinson, L. "Deep down things": in what ways is information physical, and why does it matter for information science? 2013 Information Research: An International Electronic Journal
Vol. 18(3) 
article  
Abstract: Introduction. Rolf Landauer declared in 1991 that 'information is physical'. Since then, information has come to be seen by many physicists as a fundamental component of the physical world; indeed by some as the physical component. This idea is now gaining currency in popular science communication. However, it is often far from clear what exactly this statement means; exactly how is information physical? And why should this matter for information science? The purpose of this paper is to clarify just what is meant by the physical nature of information, and the significance of these considerations for our discipline. Methods. A selective literature review and conceptual analysis, based on literature from both physical science and information science. Results. The prospect of attempting to make links between objective and subjective conceptions of information has been strongly advocated by some authors and doubted by others. The physical nature of information can be understood from three main perspectives: the relation between information and physical entropy; the strongly informational nature of the quantum view of nature; and the possibility of recasting physical laws in informational terms. Conclusions. Based on this analysis, we muse on the relevance of such issues to information science, with particular reference to emergent properties of information. Apart from the added public awareness of the i-word in a very different context from the norm, it may be that there are general laws and principles, or at least useful metaphors and analogies, linking the concept of information in the physical, biological and social domains.
BibTeX:
@article{bawden-deep-2013,
  author = {Bawden, D. and Robinson, L.},
  title = {"Deep down things": in what ways is information physical, and why does it matter for information science?},
  journal = {Information Research: An International Electronic Journal},
  year = {2013},
  volume = {18},
  number = {3}
}
Shannon, C.E., Sloane, N.J.A., Wyner, A.D. and IEEE Information Theory Society "Information Theory" from Encyclopedia Britannica 14th edition 1968 1993 Claude Elwood Shannon: collected papers, pp. 212-220  incollection  
BibTeX:
@incollection{shannon-information-1993,
  author = {Shannon, Claude E. and Sloane, N. J. A. and Wyner, A. D. and {IEEE Information Theory Society}},
  title = {"Information Theory" from Encyclopedia Britannica 14th edition 1968},
  booktitle = {Claude Elwood Shannon: collected papers},
  publisher = {IEEE Press},
  year = {1993},
  pages = {212--220}
}
Dennett, D. "Luck, Regret, and Kinds of Persons": Reply to Slote and Rovane 1994 Philosophical Topics
Vol. 22(1), pp. 558 
article  
BibTeX:
@article{dennett-luck-1994,
  author = {Dennett, Daniel},
  title = {"Luck, Regret, and Kinds of Persons": Reply to Slote and Rovane},
  journal = {Philosophical Topics},
  year = {1994},
  volume = {22},
  number = {1},
  pages = {558}
}
Hájek, A. "Mises Redux" – Redux: Fifteen Arguments against Finite Frequentism 1996 Erkenntnis (1975-)
Vol. 45(2/3), pp. 209-227 
article  
Abstract: According to finite frequentism, the probability of an attribute A in a finite reference class B is the relative frequency of actual occurrences of A within B. I present fifteen arguments against this position.
BibTeX:
@article{hajek-mises-1996,
  author = {Hájek, Alan},
  title = {"Mises Redux" – Redux: Fifteen Arguments against Finite Frequentism},
  journal = {Erkenntnis (1975-)},
  year = {1996},
  volume = {45},
  number = {2/3},
  pages = {209--227}
}
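The thesis Hájek attacks is simple enough to state in code. As a rough illustration (the function and names here are mine, not Hájek's formalism), finite frequentism identifies the probability of an attribute A in a finite reference class B with the relative frequency of A within B:

```python
# Finite frequentism, as characterized in the abstract: the probability of an
# attribute A in a finite reference class B is the relative frequency of
# actual occurrences of A within B. Illustrative sketch only.

def relative_frequency(reference_class, has_attribute):
    """Relative frequency of an attribute within a finite reference class."""
    if not reference_class:
        # One of the well-known trouble spots: the empty reference class.
        raise ValueError("undefined for an empty reference class")
    return sum(1 for x in reference_class if has_attribute(x)) / len(reference_class)

# The "probability" of heads in a finite class of ten actual tosses:
tosses = ["H", "T", "H", "H", "T", "H", "T", "H", "H", "T"]
print(relative_frequency(tosses, lambda t: t == "H"))  # 0.6
```

The sketch also makes one of Hájek's targets visible: the value depends entirely on which finite class of actual outcomes one happens to consult.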
Teller, P. "Saving the phenomena" today 2010 Philosophy of Science
Vol. 77(5), pp. 815 
article  
Abstract: Bogen and Woodward argued for the indirect connection between data and theory in terms of their conception of "phenomena." I outline and elaborate on their presentation. To illuminate the connection with contemporary thinking in terms of models, I distinguish between phenomena tokens, representations of which can be identified with data models, and phenomena types that can be identified with relatively low-lying models or aspects of models in the model hierarchy. Throughout I stress the role of idealization in these considerations.
BibTeX:
@article{teller-saving-2010,
  author = {Teller, Paul},
  title = {"Saving the phenomena" today},
  journal = {Philosophy of Science},
  year = {2010},
  volume = {77},
  number = {5},
  pages = {815}
}
Hume, D. Of the Standard of Taste 1990   book  
BibTeX:
@book{hume--1990,
  author = {Hume, David},
  title = {Of the Standard of Taste},
  publisher = {BiblioBytes},
  year = {1990}
}
Goldman, A.I. A Causal Theory of Knowing 1967 The Journal of Philosophy
Vol. 64(12), pp. 357-372 
article  
BibTeX:
@article{goldman-causal-1967,
  author = {Goldman, Alvin I.},
  title = {A Causal Theory of Knowing},
  journal = {The Journal of Philosophy},
  year = {1967},
  volume = {64},
  number = {12},
  pages = {357--372}
}
Baars, B.J. A Cognitive Theory of Consciousness 1988   book URL 
BibTeX:
@book{baars-cognitive-1988,
  author = {Baars, B. J.},
  title = {A Cognitive Theory of Consciousness},
  publisher = {Cambridge University Press},
  year = {1988},
  url = {http://bernardbaars.pbworks.com/f/++++Functions+of+Consciousness.pdf}
}
Armstrong, D.M. A combinatorial theory of possibility 1989   book  
BibTeX:
@book{armstrong-combinatorial-1989,
  author = {Armstrong, D. M.},
  title = {A combinatorial theory of possibility},
  publisher = {Cambridge University Press},
  year = {1989}
}
Cohen, Y.E. and Andersen, R.A. A common reference frame for movement plans in the posterior parietal cortex 2002 Nat Rev Neurosci
Vol. 3 
article DOI URL 
BibTeX:
@article{cohen-common-2002,
  author = {Cohen, Y. E. and Andersen, R. A.},
  title = {A common reference frame for movement plans in the posterior parietal cortex},
  journal = {Nat Rev Neurosci},
  year = {2002},
  volume = {3},
  url = {http://dx.doi.org/10.1038/nrn873},
  doi = {10.1038/nrn873}
}
Kim, J., Sosa, E. and Rosenkrantz, G.S. A companion to metaphysics 2009
Vol. 7 
book  
BibTeX:
@book{kim-companion-2009,
  author = {Kim, Jaegwon and Sosa, Ernest and Rosenkrantz, Gary S.},
  title = {A companion to metaphysics},
  publisher = {Wiley-Blackwell},
  year = {2009},
  volume = {7},
  edition = {2nd}
}
Hughston, L.P., Jozsa, R. and Wootters, W.K. A complete classification of quantum ensembles having a given density matrix 1993 Physics Letters A
Vol. 183(1), pp. 14-18 
article DOI URL 
Abstract: A complete constructive classification is given for all discrete ensembles of pure quantum states having a given density matrix. As a special case this provides a classification of positive operator valued measures with finitely many components. We also show that any chosen ensemble consistent with a fixed density matrix ϱ can be created at space-like separation using an entangled state depending only on ϱ.
BibTeX:
@article{hughston-complete-1993,
  author = {Hughston, Lane P. and Jozsa, Richard and Wootters, William K.},
  title = {A complete classification of quantum ensembles having a given density matrix},
  journal = {Physics Letters A},
  year = {1993},
  volume = {183},
  number = {1},
  pages = {14--18},
  url = {http://www.sciencedirect.com/science/article/pii/0375960193908809},
  doi = {10.1016/0375-9601(93)90880-9}
}
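The theorem concerns the non-uniqueness of ensemble decompositions of a density matrix. A standard textbook illustration of that non-uniqueness (not an example taken from the paper itself), sketched with NumPy: the equal mixture of |0> and |1> and the equal mixture of |+> and |-> yield the same density matrix, I/2.

```python
import numpy as np

def density_matrix(states, weights):
    """rho = sum_i w_i |psi_i><psi_i| for an ensemble of pure states."""
    return sum(w * np.outer(psi, psi.conj()) for w, psi in zip(weights, states))

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# Two distinct ensembles of pure states...
rho_a = density_matrix([ket0, ket1], [0.5, 0.5])
rho_b = density_matrix([plus, minus], [0.5, 0.5])

# ...one and the same density matrix (the maximally mixed state I/2).
print(np.allclose(rho_a, rho_b))  # True
```

Hughston, Jozsa and Wootters classify all such decompositions for a given rho; the code only exhibits the simplest pair.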
Tononi, G., Sporns, O. and Edelman, G.M. A complexity measure for selective matching of signals by the brain 1996 Proceedings of the National Academy of Sciences of the United States of America
Vol. 93 
article DOI URL 
BibTeX:
@article{tononi-complexity-1996,
  author = {Tononi, G. and Sporns, O. and Edelman, G. M.},
  title = {A complexity measure for selective matching of signals by the brain},
  journal = {Proceedings of the National Academy of Sciences of the United States of America},
  year = {1996},
  volume = {93},
  url = {http://dx.doi.org/10.1073/pnas.93.8.3422},
  doi = {10.1073/pnas.93.8.3422}
}
Zenil, H. A computable universe: understanding and exploring nature as computation 2013   book  
BibTeX:
@book{zenil-computable-2013,
  author = {Zenil, Hector},
  title = {A computable universe: understanding and exploring nature as computation},
  publisher = {World Scientific},
  year = {2013}
}
Pouget, A., Deneve, S. and Duhamel, J.R. A computational perspective on the neural basis of multisensory spatial representations 2002 Nat Rev Neurosci
Vol. 3 
article DOI URL 
BibTeX:
@article{pouget-computational-2002,
  author = {Pouget, A. and Deneve, S. and Duhamel, J. R.},
  title = {A computational perspective on the neural basis of multisensory spatial representations},
  journal = {Nat Rev Neurosci},
  year = {2002},
  volume = {3},
  url = {http://dx.doi.org/10.1038/nrn914},
  doi = {10.1038/nrn914}
}
Adriaans, P. A Critical Analysis of Floridi's Theory of Semantic Information 2010 Knowledge, Technology & Policy
Vol. 23(1-2), pp. 1-16 
article  
Abstract: In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science, which all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises as to what the exact relation between these various conceptions of information is, and whether there is a real need to enrich these mathematically more or less rigid definitions with a less formal notion of semantic information. I investigate various philosophical aspects of the more formal definitions of information in the light of Floridi's theory. The position I defend is that the formal treatment of the notion of information as a general theory of entropy is one of the fundamental achievements of modern science that in itself is a rich source for new philosophical reflection. This makes information theory a competitor of classical epistemology rather than a servant. In this light Floridi's philosophy of information is more a reprise of classical epistemology that only pays lip service to information theory but fails to address the important central questions of philosophy of information. Specifically, I will defend the view that notions associated with truth, knowledge, and meaning can all adequately be reconstructed in the context of modern information theory and that consequently there is no need to introduce a concept of semantic information.
BibTeX:
@article{adriaans-critical-2010,
  author = {Adriaans, Pieter},
  title = {A Critical Analysis of Floridi's Theory of Semantic Information},
  journal = {Knowledge, Technology \& Policy},
  year = {2010},
  volume = {23},
  number = {1-2},
  pages = {1--16}
}
Wei, W. and Jinlong, Y. A Critical Analysis of Structural Realism 2008 Frontiers of Philosophy in China
Vol. 3(2), pp. 294-306 
article  
Abstract: The epistemological version of structural realism, proposed by Cao Tianyu, has great influence in the philosophy of science. Synthese has published a special volume discussing the topic. Cao criticizes anti-realism, as well as the epistemic and ontic versions of structural realism. From the concepts of structure, ontology, and construction, he analyzes the objectivity of scientific theories as having five aspects: construction, historicity, holism, revision, and revolution. This paper systematically analyzes and comments on Cao's structural realism. The author agrees with his criticism of the under-determination thesis, is neutral to his argument against ontological discontinuity, and questions his universal language argument.
BibTeX:
@article{wei-critical-2008,
  author = {Wei, Wang and Jinlong, Yu},
  title = {A Critical Analysis of Structural Realism},
  journal = {Frontiers of Philosophy in China},
  year = {2008},
  volume = {3},
  number = {2},
  pages = {294--306}
}
Lam, V. and Esfeld, M. A dilemma for the emergence of spacetime in canonical quantum gravity 2013 Studies in History and Philosophy of Science Part B - Studies in History and Philosophy of Modern Physics
Vol. 44(3), pp. 286-293 
article  
Abstract: The procedures of canonical quantization of the gravitational field apparently lead to entities for which any interpretation in terms of spatio-temporal localization or spatio-temporal extension seems difficult. This fact is the main ground for the suggestion that can often be found in the physics literature on canonical quantum gravity according to which spacetime may not be fundamental in some sense. This paper aims to investigate this radical suggestion from an ontologically serious point of view in the cases of two standard forms of canonical quantum gravity, quantum geometrodynamics and loop quantum gravity. We start by discussing the physical features of the quantum wave functional of quantum geometrodynamics and of the spin networks (and spin foams) of loop quantum gravity that motivate the view according to which spacetime is not fundamental. We then point out that, by contrast, for any known ontologically serious understanding of quantum entanglement, the commitment to spacetime seems indispensable. Against this background, we then critically discuss the idea that spacetime may emerge from more fundamental entities. As a consequence, we finally suggest that the emergence of classical spacetime in canonical quantum gravity faces a dilemma: either spacetime ontologically emerges from more fundamental non-spatio-temporal entities or it already belongs to the fundamental quantum gravitational level and the emergence of the classical picture is merely a matter of levels of description. On the first horn of the dilemma, it is unclear how to make sense of concrete physical entities that are not in spacetime and of the notion of ontological emergence that is involved. The second horn runs into the difficulties raised by the physics of canonical quantum gravity.
BibTeX:
@article{lam-dilemma-2013,
  author = {Lam, Vincent and Esfeld, Michael},
  title = {A dilemma for the emergence of spacetime in canonical quantum gravity},
  journal = {Studies in History and Philosophy of Science Part B - Studies in History and Philosophy of Modern Physics},
  year = {2013},
  volume = {44},
  number = {3},
  pages = {286--293}
}
Floridi, L. A Distributed Model of Truth for Semantic Information 2009   inproceedings  
BibTeX:
@inproceedings{floridi-distributed-2009,
  author = {Floridi, Luciano},
  title = {A Distributed Model of Truth for Semantic Information},
  year = {2009}
}
Crick, F. and Koch, C. A framework for consciousness 2003 Nat Neurosci
Vol. 6 
article DOI URL 
BibTeX:
@article{crick-framework-2003,
  author = {Crick, F. and Koch, C.},
  title = {A framework for consciousness},
  journal = {Nat Neurosci},
  year = {2003},
  volume = {6},
  url = {http://dx.doi.org/10.1038/nn0203-119},
  doi = {http://doi.org/10.1038/nn0203-119}
}
Ferrero, M., Gómez Pin, V., Salgado, D. and Sánchez-Gómez, J.L. A Further Review of the Incompatibility between Classical Principles and Quantum Postulates 2013 Foundations of Science
Vol. 18(1), pp. 125-138 
article  
Abstract: The traditional "realist" conception of physics, according to which human concepts, laws and theories can grasp the essence of a reality in our absence, seems incompatible with quantum formalism and its most fruitful interpretation. The proof rests on the violation by quantum mechanical formalism of some fundamental principles of the classical ontology. We discuss whether the conception behind Einstein's idea of a reality in our absence could still be maintained, and at what price. We conclude that quantum mechanical formalism is not formulated on those terms, leaving for a separate paper the discussion about the terms in which it could be formulated and the onto-epistemological implications it might have.
Keywords: foundations of quantum physics; philosophy of science; non-contextuality; individuation; locality; determinism; correlated quantum systems with finite Hilbert space; entanglement swapping
BibTeX:
@article{ferrero-further-2013,
  author = {Ferrero, M. and Gómez Pin, V. and Salgado, D. and Sánchez-Gómez, J. L.},
  title = {A Further Review of the Incompatibility between Classical Principles and Quantum Postulates},
  journal = {Foundations of Science},
  year = {2013},
  volume = {18},
  number = {1},
  pages = {125--138}
}
Seife, C. A general surrenders the field, but black hole battle rages on: Stephen Hawking may have changed his mind, but questions about the fate of information continue to expose fault lines between relativity and quantum theories 2004 Science
Vol. 305(5686), pp. 934 
article  
BibTeX:
@article{siefe-general-2004,
  author = {Seife, Charles},
  title = {A general surrenders the field, but black hole battle rages on: Stephen Hawking may have changed his mind, but questions about the fate of information continue to expose fault lines between relativity and quantum theories},
  journal = {Science},
  year = {2004},
  volume = {305},
  number = {5686},
  pages = {934}
}
Breuer, T. A Gödel-Turing perspective on quantum states indistinguishable from inside 2012   incollection  
BibTeX:
@incollection{breuer-go-turing-2012,
  author = {Breuer, Thomas},
  title = {A Gödel-Turing perspective on quantum states indistinguishable from inside},
  year = {2012}
}
Cross, C.B. A Logical Transmission Principle for Conclusive Reasons 2015 Australasian Journal of Philosophy
Vol. 93(2), pp. 353-370 
article  
Abstract: Dretske's conclusive reasons account of knowledge is designed to explain how epistemic closure can fail when the evidence for a belief does not transmit to some of that belief's logical consequences. Critics of Dretske dispute the argument against closure while joining Dretske in writing off transmission. This paper shows that, in the most widely accepted system for counterfactual logic (David Lewis's system VC), conclusive reasons are governed by an informative, non-trivial, logical transmission principle. If r is a conclusive reason for believing p in Dretske's sense, and if p logically implies q, and if p and q satisfy one additional condition, it follows that r is a conclusive reason for believing q. After introducing this additional condition, I explain its intuitive import and use the condition to shed new light on Dretske's response to scepticism, as well as on his distinction between the so-called 'lightweight' and 'heavyweight' implications of a piece of perceptual knowledge.
BibTeX:
@article{cross-logical-2015,
  author = {Cross, Charles B.},
  title = {A Logical Transmission Principle for Conclusive Reasons},
  journal = {Australasian Journal of Philosophy},
  year = {2015},
  volume = {93},
  number = {2},
  pages = {353--370}
}
Shannon, C.E. A Mathematical Theory of Communication 2001 SIGMOBILE Mob. Comput. Commun. Rev.
Vol. 5(1), pp. 3-55 
article DOI URL 
BibTeX:
@article{shannon-mathematical-2001,
  author = {Shannon, C. E.},
  title = {A Mathematical Theory of Communication},
  journal = {SIGMOBILE Mob. Comput. Commun. Rev.},
  year = {2001},
  volume = {5},
  number = {1},
  pages = {3--55},
  url = {http://doi.acm.org/10.1145/584091.584093},
  doi = {10.1145/584091.584093}
}
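For reference, the central quantity of Shannon's paper, the entropy H = -sum_i p_i log2 p_i, can be sketched in a few lines of Python (the function name is mine):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute nothing (p log p -> 0 as p -> 0).
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
print(entropy([1.0]))       # 0.0 bits: a certain outcome carries no information
```

Entropy is maximal for the uniform distribution and zero for a certain outcome, which is the sense in which it measures the average information produced by a source.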
Klüver, J. A mathematical theory of communication: Meaning, information, and topology 2011 Complexity
Vol. 16(3), pp. 10-26 
article  
Abstract: This article proposes a new mathematical theory of communication. The basic concepts of meaning and information are defined in terms of complex systems theory. Meaning of a message is defined as the attractor it generates in the receiving system; information is defined as the difference between a vector of expectation and one of perception. It can be shown that both concepts are determined by the topology of the receiving system.
BibTeX:
@article{kluver-mathematical-2011,
  author = {Klüver, Jürgen},
  title = {A mathematical theory of communication: Meaning, information, and topology},
  journal = {Complexity},
  year = {2011},
  volume = {16},
  number = {3},
  pages = {10--26}
}
Shannon, C.E. A Mathematical Theory of Communication (reprinted with corrections, 1998; 50th anniversary release of the 1948 paper) 1998 The Bell System Technical Journal  article URL 
BibTeX:
@article{shannon-claude-e.-mathematical-1998,
  author = {Shannon, Claude E.},
  title = {A Mathematical Theory of Communication (reprinted with corrections, 1998; 50th anniversary release of the 1948 paper)},
  journal = {The Bell System Technical Journal},
  year = {1998},
  url = {http://cm.bell-labs.com/cS/ms/what/shannonday/paper.html}
}
Copley, S.D., Smith, E., Morowitz, H.J. and Petsko, G.A. A Mechanism for the Association of Amino Acids with Their Codons and the Origin of the Genetic Code 2005 Proceedings of the National Academy of Sciences of the United States of America
Vol. 102(12), pp. 4442-4447 
article  
Abstract: The genetic code has certain regularities that have resisted mechanistic interpretation. These include strong correlations between the first base of codons and the precursor from which the encoded amino acid is synthesized and between the second base of codons and the hydrophobicity of the encoded amino acid. These regularities are even more striking in a projection of the modern code onto a simpler code consisting of doublet codons encoding a set of simple amino acids. These regularities can be explained if, before the emergence of macromolecules, simple amino acids were synthesized in covalent complexes of dinucleotides with α-keto acids originating from the reductive tricarboxylic acid cycle or reductive acetate pathway. The bases and phosphates of the dinucleotide are proposed to have enhanced the rates of synthetic reactions leading to amino acids in a small-molecule reaction network that preceded the RNA translation apparatus but created an association between amino acids and the first two bases of their codons that was retained when translation emerged later in evolution.
BibTeX:
@article{copley-mechanism-2005,
  author = {Copley, Shelley D. and Smith, Eric and Morowitz, Harold J. and Petsko, Gregory A.},
  title = {A Mechanism for the Association of Amino Acids with Their Codons and the Origin of the Genetic Code},
  journal = {Proceedings of the National Academy of Sciences of the United States of America},
  year = {2005},
  volume = {102},
  number = {12},
  pages = {4442--4447}
}
Chakravartty, A. A metaphysics for scientific realism: knowing the unobservable 2007   book  
BibTeX:
@book{chakravartty-metaphysics-2007,
  author = {Chakravartty, Anjan},
  title = {A metaphysics for scientific realism: knowing the unobservable},
  publisher = {Cambridge University Press},
  year = {2007}
}
Votsis, I. A Metaphysics for Scientific Realism, by A. Chakravartty 2009 Analysis
Vol. 69(2), pp. 378-380 
article  
BibTeX:
@article{votsis-metaphysics-2009,
  author = {Votsis, Ioannis},
  title = {A Metaphysics for Scientific Realism, by A. Chakravartty},
  journal = {Analysis},
  year = {2009},
  volume = {69},
  number = {2},
  pages = {378--380}
}
Lee, A., Streinu, I. and Brock, O. A methodology for efficiently sampling the conformation space of molecular structures 2005 Physical Biology
Vol. 2(4), pp. S108-S115 
article  
Abstract: Motivated by recently developed computational techniques for studying protein flexibility, and their potential applications in docking, we propose an efficient method for sampling the conformational space of complex molecular structures. We focus on the loop closure problem, identified in the work of Thorpe and Lei (2004 Phil. Mag. 84 1323-31) as a primary bottleneck in the fast simulation of molecular motions. By modeling a molecular structure as a branching robot, we use an intuitive method in which the robot holds onto itself for maintaining loop constraints. New conformations are generated by applying random external forces, while internal, attractive forces pull the loops closed. Our implementation, tested on several model molecules with low number of degrees of freedom but many interconnected loops, gives promising results that show an almost four times speed-up on the benchmark cube-molecule of Thorpe and Lei.
BibTeX:
@article{lee-methodology-2005,
  author = {Lee, Audrey and Streinu, Ileana and Brock, Oliver},
  title = {A methodology for efficiently sampling the conformation space of molecular structures},
  journal = {Physical Biology},
  year = {2005},
  volume = {2},
  number = {4},
  pages = {S108--S115}
}
Lumer, E.D. A neural model of binocular integration and rivalry based on the coordination of action-potential timing in primary visual cortex 1998 Cereb Cortex
Vol. 8 
article DOI URL 
BibTeX:
@article{lumer-neural-1998,
  author = {Lumer, E. D.},
  title = {A neural model of binocular integration and rivalry based on the coordination of action-potential timing in primary visual cortex},
  journal = {Cereb Cortex},
  year = {1998},
  volume = {8},
  url = {http://dx.doi.org/10.1093/cercor/8.6.553},
  doi = {10.1093/cercor/8.6.553}
}
Dehaene, S., Sergent, C. and Changeux, J.P. A neuronal network model linking subjective reports and objective physiological data during conscious perception 2003 Proc Natl Acad Sci U S A
Vol. 100 
article DOI URL 
BibTeX:
@article{dehaene-neuronal-2003,
  author = {Dehaene, S. and Sergent, C. and Changeux, J. P.},
  title = {A neuronal network model linking subjective reports and objective physiological data during conscious perception},
  journal = {Proc Natl Acad Sci U S A},
  year = {2003},
  volume = {100},
  url = {http://dx.doi.org/10.1073/pnas.1332574100},
  doi = {10.1073/pnas.1332574100}
}
Fredkin, E. A New Cosmogony 1992 Workshop on Physics and Computation (PhysComp '92), pp. 116-121  inproceedings DOI  
BibTeX:
@inproceedings{fredkin-new-1992,
  author = {Fredkin, E.},
  title = {A New Cosmogony},
  booktitle = {Workshop on Physics and Computation (PhysComp '92)},
  year = {1992},
  pages = {116--121},
  doi = {10.1109/PHYCMP.1992.615507}
}
Rickles, D.P. A new spin on the hole argument 2005 Studies in History and Philosophy of Modern Physics
Vol. 36(3), pp. 415-434 
article  
Abstract: This paper shows how an exact analog of the hole argument can be constructed in the loop representation of quantum gravity. The new argument is based on the embedding of spin-networks in a manifold along with the action of the diffeomorphism constraint on them. I will then briefly discuss the implications of this result vis-à-vis the ontological debate about the nature of space. I argue that the conclusions of many physicists working on loop quantum gravity (e.g., Rovelli and Smolin) that the loop representation uniquely supports relationalism are unfounded, as is the claim of Belot and Earman that quantum gravity might serve to settle the debate one way or the other.
BibTeX:
@article{rickles-new-2005,
  author = {Rickles, D. P.},
  title = {A new spin on the hole argument},
  journal = {Studies in History and Philosophy of Modern Physics},
  year = {2005},
  volume = {36},
  number = {3},
  pages = {415--434}
}
Bostrom, N. and Kulczycki, M. A patch for the Simulation Argument 2011 Analysis
Vol. 71(1), pp. 54-61 
article  
BibTeX:
@article{bostrom-patch-2011,
  author = {Bostrom, Nick and Kulczycki, Marcin},
  title = {A patch for the Simulation Argument},
  journal = {Analysis},
  year = {2011},
  volume = {71},
  number = {1},
  pages = {54--61}
}
Summers, S.J. A Perspective on Constructive Quantum Field Theory 2012   article  
Abstract: An overview of the accomplishments of constructive quantum field theory is provided.
BibTeX:
@article{summers-perspective-2012,
  author = {Summers, Stephen J.},
  title = {A Perspective on Constructive Quantum Field Theory},
  year = {2012}
}
Swanson, N. A philosopher's guide to the foundations of quantum field theory 2017 Philosophy Compass
Vol. 12(5) 
article  
Abstract: A major obstacle facing interpreters of quantum field theory (QFT) is a proliferation of different theoretical frameworks. This article surveys three of the main available options (Lagrangian, Wightman, and algebraic QFT) and examines how they are related. Although each framework emphasizes different aspects of QFT, leading to distinct strengths and weaknesses, there is less tension between them than commonly assumed. Given the limitations of our current knowledge and the need for creative new ideas, I urge philosophers to explore puzzles, tools, and techniques from all three approaches.
BibTeX:
@article{swanson-philosophers-2017,
  author = {Swanson, Noel},
  title = {A philosopher's guide to the foundations of quantum field theory},
  journal = {Philosophy Compass},
  year = {2017},
  volume = {12},
  number = {5}
}
Sequoiah-Grayson, S. A Positive Information Logic for Inferential Information 2009 Synthese
Vol. 167(2), pp. 409-431 
article  
Abstract: Performing an inference involves irreducibly dynamic cognitive procedures. The article proposes that a non-associative information frame, corresponding to a residuated pogroupoid, underpins the information structure involved. The argument proceeds by expounding the informational turn in logic, before outlining the cognitive actions at work in deductive inference. The structural rules of Weakening, Contraction, Commutation, and Association are rejected on the grounds that they cause us to lose track of the information flow in inferential procedures. By taking the operation of information application as the primary operation, the fusion connective is retained, with commutative failure generating a double implication. The other connectives are rejected.
BibTeX:
@article{sequoiah-grayson-positive-2009,
  author = {Sequoiah-Grayson, Sebastian},
  title = {A Positive Information Logic for Inferential Information},
  journal = {Synthese},
  year = {2009},
  volume = {167},
  number = {2},
  pages = {409--431}
}
Taddeo, M. and Floridi, L. A Praxical Solution of the Symbol Grounding Problem 2007 Minds and Machines
Vol. 17(4), pp. 369-389 
article  
Abstract: This article is the second step in our research into the Symbol Grounding Problem (SGP). In a previous work, we defined the main condition that must be satisfied by any strategy in order to provide a valid solution to the SGP, namely the zero semantic commitment condition (Z condition). We then showed that all the main strategies proposed so far fail to satisfy the Z condition, although they provide several important lessons to be followed by any new proposal. Here, we develop a new solution of the SGP. It is called praxical in order to stress the key role played by the interactions between the agents and their environment. It is based on a new theory of meaning, Action-based Semantics (AbS), and on a new kind of artificial agents, called two-machine artificial agents (AM²). Thanks to their architecture, AM²s implement AbS, and this allows them to ground their symbols semantically and to develop some fairly advanced semantic abilities, including the development of semantically grounded communication and the elaboration of representations, while still respecting the Z condition.
BibTeX:
@article{taddeo-praxical-2007,
  author = {Taddeo, Mariarosaria and Floridi, Luciano},
  title = {A Praxical Solution of the Symbol Grounding Problem},
  journal = {Minds and Machines},
  year = {2007},
  volume = {17},
  number = {4},
  pages = {369--389}
}
Liu, H., Luo, L. and Fan, P. A Pre-Equalized Transmission Based on Basefield Hartley Transform over Multi-Path Fading Channels 2011 , pp. 1-6  inproceedings  
Abstract: This paper proposes a pre-equalized transmission scheme based on a finite field transform. Unlike orthogonal frequency division multiplexing (OFDM) which transmits signals on orthogonal sub- carriers at the transmitter and explores post-equalization to migrate the effect of multi-path fading at the receiver, in the proposed scheme, signals are processed with the finite field transform called basefield Hartley transform (BHT) before transmission. Since BHT has similar convolution property as discrete Fourier transform (DFT), the inter-symbol interference (ISI) can be mitigated by applying a finite field pre-equalizer and quantization pre-equalizer at the transmitter. By allowing the padded redundancy to take values in finite field, the proposed scheme has the inherent error correction capability with various coding rate. Since the received signals are equivalent to certain block error correction codes, only a decoding scheme is required to recover the source data without any DFT operations that are used in conventional OFDM receiver, which leads to a simple receiving structure with low process complexity. Our simulation results show that the proposed scheme reduces the peak-to-average power ratio (PAPR) and provides better bit error rate (BER) performance at the high signal-to-noise ratio (SNR) in comparison with the coded OFDM system.
BibTeX:
@inproceedings{liu-pre-equalized-2011,
  author = {Liu, Heng and Luo, Lin and Fan, Pingzhi},
  title = {A Pre-Equalized Transmission Based on Basefield Hartley Transform over Multi-Path Fading Channels},
  year = {2011},
  pages = {1--6}
}
Steane, A.M. A quantum computer only needs one universe 2003 Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Vol. 34(3), pp. 469-478 
article DOI  
Abstract: The nature of quantum computation is discussed. It is argued that, in terms of the amount of information manipulated in a given time, quantum and classical computation are equally efficient. Quantum superposition does not permit quantum computers to “perform many computations simultaneously” except in a highly qualified and to some extent misleading sense. Quantum computation is therefore not well described by interpretations of quantum mechanics which invoke the concept of vast numbers of parallel universes. Rather, entanglement makes available types of computation processes which, while not exponentially larger than classical ones, are unavailable to classical systems. The essence of quantum computation is that it uses entanglement to generate and manipulate a physical representation of the correlations between logical entities, without the need to completely represent the logical entities themselves.
BibTeX:
@article{steane-quantum-2003,
  author = {Steane, A. M.},
  title = {A quantum computer only needs one universe},
  journal = {Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics},
  year = {2003},
  volume = {34},
  number = {3},
  pages = {469--478},
  doi = {10.1016/S1355-2198(03)00038-8}
}
Bennett, C.H. A Resource-based View of Quantum Information 2004 Quantum Info. Comput.
Vol. 4(6), pp. 460-466 
article URL 
BibTeX:
@article{bennett-resource-based-2004,
  author = {Bennett, Charles H.},
  title = {A Resource-based View of Quantum Information},
  journal = {Quantum Info. Comput.},
  year = {2004},
  volume = {4},
  number = {6},
  pages = {460--466},
  url = {http://dl.acm.org/citation.cfm?id=2011593.2011598}
}
Ubriaco, M.R. A simple mathematical model for anomalous diffusion via Fisher's information theory 2009 Physics Letters A
Vol. 373(44), pp. 4017-4021 
article  
Abstract: Starting with the relative entropy based on a previously proposed entropy function S_q[p] = ∫ dx p(x) (−ln p(x))^q, we find the corresponding Fisher's information measure. After function redefinition we then maximize the Fisher information measure with respect to the new function and obtain a differential operator that reduces to a space-coordinate second derivative in the q → 1 limit. We then propose a simple differential equation for anomalous diffusion and show that its solutions are a generalization of the functions in the Barenblatt-Pattle solution. We find that the mean squared displacement, up to a q-dependent constant, has a time dependence according to ⟨x²⟩ ∼ K^(1/q) t^(1/q), where the parameter q takes values q = (2n−1)/(2n+1) (superdiffusion) and q = (2n+1)/(2n−1) (subdiffusion), for all n ≥ 1.
BibTeX:
@article{ubriaco-simple-2009,
  author = {Ubriaco, Marcelo R.},
  title = {A simple mathematical model for anomalous diffusion via Fisher's information theory},
  journal = {Physics Letters A},
  year = {2009},
  volume = {373},
  number = {44},
  pages = {4017--4021}
}
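The scaling law quoted in Ubriaco's abstract, MSD ∼ K^(1/q) t^(1/q) with q = (2n−1)/(2n+1) for superdiffusion and q = (2n+1)/(2n−1) for subdiffusion, can be sanity-checked in a few lines. The fractions below come directly from the abstract; the function and variable names are my own, a minimal sketch rather than anything from the paper:

```python
from fractions import Fraction

def msd_time_exponent(q):
    # Per the abstract, mean squared displacement grows as t**(1/q).
    return Fraction(1, 1) / q

for n in range(1, 5):
    q_super = Fraction(2 * n - 1, 2 * n + 1)  # superdiffusive branch: q < 1
    q_sub = Fraction(2 * n + 1, 2 * n - 1)    # subdiffusive branch: q > 1
    # Superdiffusion spreads faster than ordinary diffusion (exponent > 1),
    # subdiffusion slower (exponent < 1); ordinary diffusion has exponent 1.
    assert msd_time_exponent(q_super) > 1
    assert msd_time_exponent(q_sub) < 1
```

For n = 1 this gives exponents of 3 (superdiffusion) and 1/3 (subdiffusion), with both branches approaching the ordinary-diffusion exponent 1 as n grows.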
Burgess, J.P. and Rosen, G.A. A subject with no object: strategies for nominalistic interpretation of mathematics 1997   book  
Abstract: Numbers and other mathematical objects are exceptional in having no locations in space and time and no causes or effects in the physical world. This makes it difficult to account for the possibility of mathematical knowledge, leading many philosophers to embrace nominalism, the doctrine that there are no abstract entities. It has also led some of them to embark on ambitious projects for interpreting mathematics so as to preserve the subject while eliminating its objects, eliminating so‐called ontological commitment to numbers, sets, and the like. These projects differ considerably in the apparatus they employ, and the spirit in which they are put forward. Some employ synthetic geometry, others modal logic. Some are put forward as revolutionary replacements for existing mathematics and science, others hermeneutic hypotheses about what they have meant all along. We attempt to cut through technicalities that have obscured previous discussions of these projects, and to present concise accounts with minimal prerequisites of a dozen strategies for nominalistic interpretation of mathematics. We also examine critically the aims and claims of such interpretations, suggesting that what they really achieve is something quite different from what the authors of such projects usually assume.
BibTeX:
@book{burgess-subject-1997,
  author = {Burgess, John P. and Rosen, Gideon A.},
  title = {A subject with no object: strategies for nominalistic interpretation of mathematics},
  publisher = {Oxford University Press},
  year = {1997}
}
Fligstein, N. and McAdam, D. A theory of fields 2012   book  
BibTeX:
@book{fligstein-theory-2012,
  author = {Fligstein, Neil and McAdam, Doug},
  title = {A theory of fields},
  publisher = {Oxford University Press},
  year = {2012}
}
Balaguer, M. A Theory of Mathematical Correctness and Mathematical Truth 2001 Pacific Philosophical Quarterly
Vol. 82(2), pp. 87-114 
article  
Abstract: A theory of objective mathematical correctness is developed. The theory is consistent with both mathematical realism and mathematical antirealism, and versions of realism and anti-realism are developed that dovetail with the theory of correctness.
BibTeX:
@article{balaguer-theory-2001,
  author = {Balaguer, Mark},
  title = {A Theory of Mathematical Correctness and Mathematical Truth},
  journal = {Pacific Philosophical Quarterly},
  year = {2001},
  volume = {82},
  number = {2},
  pages = {87--114}
}
Chaitin, G. A Theory of Program Size Formally Identical to Information Theory 1975 Journal of the ACM (JACM)
Vol. 22(3), pp. 329-340 
article  
BibTeX:
@article{chaitin-theory-1975,
  author = {Chaitin, Gregory},
  title = {A Theory of Program Size Formally Identical to Information Theory},
  journal = {Journal of the ACM (JACM)},
  year = {1975},
  volume = {22},
  number = {3},
  pages = {329--340}
}
Baron, S. A Truthmaker Indispensability Argument 2013 Synthese
Vol. 190(12), pp. 2413-2427 
article  
Abstract: Recently, nominalists have made a case against the Quine–Putnam indispensability argument for mathematical Platonism by taking issue with Quine's criterion of ontological commitment. In this paper I propose and defend an indispensability argument founded on an alternative criterion of ontological commitment: that advocated by David Armstrong. By defending such an argument I place the burden back onto the nominalist to defend her favourite criterion of ontological commitment and, furthermore, show that criterion cannot be used to formulate a plausible form of the indispensability argument.
BibTeX:
@article{baron-truthmaker-2013,
  author = {Baron, Sam},
  title = {A Truthmaker Indispensability Argument},
  journal = {Synthese},
  year = {2013},
  volume = {190},
  number = {12},
  pages = {2413--2427}
}
Edelman, G.M. and Tononi, G. A universe of consciousness: how matter becomes imagination 2000   book  
BibTeX:
@book{edelman-universe-2000,
  author = {Edelman, G. M. and Tononi, G.},
  title = {A universe of consciousness: how matter becomes imagination},
  publisher = {Basic Books},
  year = {2000}
}
Zeki, S. A Vision of the Brain. 1993   book URL 
BibTeX:
@book{zeki-vision-1993,
  author = {Zeki, S.},
  title = {A Vision of the Brain.},
  publisher = {Blackwell Scientific Publications},
  year = {1993},
  url = {http://www.vislab.ucl.ac.uk/avotb.php}
}
Armstrong, D.M. A world of states of affairs 1997   book  
BibTeX:
@book{armstrong-world-1997,
  author = {Armstrong, D. M.},
  title = {A world of states of affairs},
  publisher = {Cambridge University Press},
  year = {1997}
}
Rynasiewicz, R. Absolute Versus Relational Space-Time: An Outmoded Debate? 1996 The Journal of Philosophy
Vol. 93(6), pp. 279-306 
article  
Abstract: The metaphysics of space and time and why one should be skeptical that there is any such natural or preferred projection are examined. The fundamental fallacy in the debate is the supposition that the space-time theory can be stably formulated in terms that transcend any historical or conceptual context.
BibTeX:
@article{rynasiewicz-absolute-1996,
  author = {Rynasiewicz, Robert},
  title = {Absolute Versus Relational Space-Time: An Outmoded Debate?},
  journal = {The Journal of Philosophy},
  year = {1996},
  volume = {93},
  number = {6},
  pages = {279--306}
}
Hoefer, C. Absolute versus relational spacetime: For better or worse, the debate goes on 1998 The British Journal for the Philosophy of Science
Vol. 49(3), pp. 451 
article  
Abstract: The traditional absolutist-relationist debate is still clearly formulable in the context of General Relativity Theory (GTR), despite the important differences between Einstein's theory and the earlier context of Newtonian physics. This paper answers recent arguments by Robert Rynasiewicz against the significance of the debate in the GTR context.
BibTeX:
@article{hoefer-absolute-1998,
  author = {Hoefer, Carl},
  title = {Absolute versus relational spacetime: For better or worse, the debate goes on},
  journal = {The British Journal for the Philosophy of Science},
  year = {1998},
  volume = {49},
  number = {3},
  pages = {451}
}
Blass, A. and Gurevich, Y. Abstract Hilbertian deductive systems, infon logic, and Datalog 2013 Information and Computation
Vol. 231, pp. 21-37 
article  
Abstract: In the first part of the paper, we discuss abstract Hilbertian deductive systems; these are systems defined by abstract notions of formula, axiom, and inference rule. We use these systems to develop a general method for converting derivability problems, from a broad range of deductive systems, into the derivability problem in a quite specific system, namely the Datalog fragment of universal Horn logic. In this generality, the derivability problems may not be recursively solvable, let alone feasible; in particular, we may get Datalog "programs" with infinitely many rules. We then discuss what would be needed to obtain computationally useful results from this method. In the second part of the paper, we analyze a particular deductive system, primal infon logic with variables, which arose in the development of the authorization language DKAL. A consequence of our analysis of primal infon logic with variables is that its derivability problems can be translated into Datalog with only a quadratic increase of size.
BibTeX:
@article{blass-abstract-2013,
  author = {Blass, A. and Gurevich, Y.},
  title = {Abstract Hilbertian deductive systems, infon logic, and Datalog},
  journal = {Information and Computation},
  year = {2013},
  volume = {231},
  pages = {21--37}
}
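The fixpoint idea behind the Blass–Gurevich translation can be illustrated with a minimal propositional sketch (my own toy, not code from the paper): in the variable-free Datalog fragment of Horn logic, derivability is just membership in the least set of atoms closed under the rules, computable by naive forward chaining.

```python
# Hypothetical sketch: derivability in the propositional (variable-free)
# Datalog fragment of universal Horn logic, by forward chaining to a fixpoint.
# A rule is (head, [body atoms]); a rule with an empty body is a fact.

def derivable(rules, goal):
    facts = {head for head, body in rules if not body}
    changed = True
    while changed:  # iterate until no rule adds a new atom
        changed = False
        for head, body in rules:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return goal in facts

rules = [
    ("p", []),          # fact: p.
    ("q", ["p"]),       # rule: q :- p.
    ("r", ["p", "q"]),  # rule: r :- p, q.
]
```

Termination is immediate here because each pass either adds an atom or stops, and the atom set is finite; the paper's interesting cases are exactly those where this finiteness assumption fails.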
Campbell, K. Abstract particulars 1990   book  
BibTeX:
@book{campbell-abstract-1990,
  author = {Campbell, Keith},
  title = {Abstract particulars},
  publisher = {B. Blackwell},
  year = {1990}
}
Ganascia, J.-G. Abstraction of levels of abstraction 2015 Journal of Experimental & Theoretical Artificial Intelligence
Vol. 27(1), pp. 23-35 
article  
Abstract: The notion of level of abstraction (LoA) is one of the foundations of Floridi's Philosophy of Information. It also serves many practical purposes, as in information ethics. But the notion of abstraction is not new; it has been given many different meanings in various fields, especially in scientific disciplines and, in particular, in computer science. Our purpose here is to examine the use of abstraction in Floridi's works in conjunction with some of the meanings of abstraction in computer science. The article is divided into five sections. After a general introduction to Floridi's method of abstraction (MoA) in Section 1, Section 2 revisits Floridi's definition of abstraction and Section 3 gives the different senses of abstraction in computer science. Section 4 compares them with Floridi's LoAs and proposes to generalise Floridi's approach to abstraction using an abstraction of the LoAs, while Section 5 concludes on what we think to be some new arguments in favour of MoA and LoA.
BibTeX:
@article{ganascia-abstraction-2015,
  author = {Ganascia, Jean-Gabriel},
  title = {Abstraction of levels of abstraction},
  journal = {Journal of Experimental & Theoretical Artificial Intelligence},
  year = {2015},
  volume = {27},
  number = {1},
  pages = {23--35}
}
Miller, P. and Power, M. Accounting, Organizing, and Economizing: Connecting Accounting Research and Organization Theory 2013 Academy of Management Annals
Vol. 7(1), pp. 557-605 
article  
Abstract: This paper encourages scholars of management to pay attention to the mutually constitutive nature of accounting, organizing, and economizing. This means viewing accounting as much more than an instrumental and purely technical activity. We identify four key roles of accounting: first, territorializing, the recursive construction of the calculable spaces that actors inhabit within organizations and society; second, mediating, that much of what accounting instruments and ideas do is to link up distinct actors, aspiration, and arenas; third, adjudicating, that accounting plays a decisive role in evaluating the performance of individuals and organizations, and also in determining failings and failures; and fourth, that accounting is a subjectivizing practice par excellence, that it both subjects individuals to control or regulation by another, while entailing the presumption of an individual free to choose. The entanglement of these four roles, we suggest, is what gives the accounting complex its productive force, such that it is perhaps the most powerful system of representation for social and economic life today in many national settings. We examine these issues through a selective review of the accounting literature based on the construction of two intellectual histories. One deals with the growth of scholarly interest in organizations which created the conditions for a behavioral turn in accounting research and the embedding of accounting within management scholarship. The other schematic history deals with the emergence of normative accounting pedagogy and theory from practice. This was challenged by an empirical revolution drawing on the methods of analytical economics which was broadly market-based, facing away from management. 
We argue for a third body of work which reacts to the reductionism of both and which focuses on the processes by which accounting representations and metrics are simultaneously powerful interventions which shape people, practices, and organizations. We suggest that accounting is a mechanism by which the economization of organizational life becomes elaborated and institutionalized.
BibTeX:
@article{miller-accounting-2013,
  author = {Miller, P. and Power, M.},
  title = {Accounting, Organizing, and Economizing: Connecting Accounting Research and Organization Theory},
  journal = {Academy of Management Annals},
  year = {2013},
  volume = {7},
  number = {1},
  pages = {557--605}
}
Rorty, R. Achieving our country: leftist thought in twentieth-century America 1998
book  
BibTeX:
@book{rorty-achieving-1998,
  author = {Rorty, Richard},
  title = {Achieving our country: leftist thought in twentieth-century America},
  publisher = {Harvard University Press},
  year = {1998}
}
Lunin, O. and Mathur, S.D. AdS/CFT duality and the black hole information paradox 2002 Nuclear Physics B
Vol. 623(1), pp. 342-394 
article  
BibTeX:
@article{lunin-ads/cft-2002,
  author = {Lunin, Oleg and Mathur, Samir D.},
  title = {AdS/CFT duality and the black hole information paradox},
  journal = {Nuclear Physics B},
  year = {2002},
  volume = {623},
  number = {1},
  pages = {342--394}
}
Grünwald, P.D., Myung, I.J. and Pitt, M.A. Advances in minimum description length: theory and applications 2005   book  
BibTeX:
@book{grunwald-advances-2005,
  author = {Grünwald, Peter D. and Myung, In J. and Pitt, Mark A.},
  title = {Advances in minimum description length: theory and applications},
  publisher = {MIT},
  year = {2005}
}
Zangwill, N. Aesthetic Judgment 2014 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{zangwill-aesthetic-2014,
  author = {Zangwill, Nick},
  title = {Aesthetic Judgment},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2014},
  edition = {Fall 2014},
  url = {https://plato.stanford.edu/archives/fall2014/entries/aesthetic-judgment/}
}
Göcke, B.P. After physicalism 2012   book  
BibTeX:
@book{gocke-after-2012,
  author = {Göcke, Benedikt P.},
  title = {After physicalism},
  year = {2012}
}
Floridi, L. Afterword LIS as applied philosophy of information: A reappraisal 2004 Library Trends
Vol. 52(3), pp. 658-667 
article  
Abstract: Floridi proposes that library information science should develop its foundation in terms of a philosophy of information (PI). He believes that PI should seek to explain a very wide range of phenomena and practices because its aim is foundationalist.
BibTeX:
@article{floridi-afterword-2004,
  author = {Floridi, Luciano},
  title = {Afterword LIS as applied philosophy of information: A reappraisal},
  journal = {Library Trends},
  year = {2004},
  volume = {52},
  number = {3},
  pages = {658--667}
}
Floridi, L. Against Digital Ontology 2009 Synthese
Vol. 168(1), pp. 151-178 
article  
Abstract: The paper argues that digital ontology (the ultimate nature of reality is digital, and the universe is a computational system equivalent to a Turing Machine) should be carefully distinguished from informational ontology (the ultimate nature of reality is structural), in order to abandon the former and retain only the latter as a promising line of research. Digital vs. analogue is a Boolean dichotomy typical of our computational paradigm, but digital and analogue are only "modes of presentation" of Being (to paraphrase Kant), that is, ways in which reality is experienced or conceptualised by an epistemic agent at a given level of abstraction. A preferable alternative is provided by an informational approach to structural realism, according to which knowledge of the world is knowledge of its structures. The most reasonable ontological commitment turns out to be in favour of an interpretation of reality as the totality of structures dynamically interacting with each other. The paper is the first part (the pars destruens) of a two-part piece of research. The pars construens, entitled "A Defence of Informational Structural Realism", is developed in a separate article, also published in this journal.
BibTeX:
@article{floridi-against-2009,
  author = {Floridi, Luciano},
  title = {Against Digital Ontology},
  journal = {Synthese},
  year = {2009},
  volume = {168},
  number = {1},
  pages = {151--178}
}
Maclaurin, J. Against reduction: A critical notice of Molecular models: philosophical papers on molecular biology by Sahotra Sarkar 2011 Biology & Philosophy
Vol. 26(1), pp. 151-158 
article  
Abstract: In Molecular Models: Philosophical Papers on Molecular Biology, Sahotra Sarkar presents a historical and philosophical analysis of four important themes in philosophy of science that have been influenced by discoveries in molecular biology. These are: reduction, function, information and directed mutation. I argue that there is an important difference between the cases of function and information and the more complex case of scientific reduction. In the former cases it makes sense to taxonomise important variations in scientific and philosophical usage of the terms "function" and "information". However, the variety of usage of "reduction" across scientific disciplines (and across philosophy of science) makes such taxonomy inappropriate. Sarkar presents reduction as a set of facts about the world that science has discovered, but the facts in question are remarkably disparate; variously semantic, epistemic and ontological. I argue that the more natural conclusion of Sarkar's analysis is eliminativism about reduction as a scientific concept.
BibTeX:
@article{maclaurin-against-2011,
  author = {Maclaurin, James},
  title = {Against reduction: A critical notice of Molecular models: philosophical papers on molecular biology by Sahotra Sarkar},
  journal = {Biology & Philosophy},
  year = {2011},
  volume = {26},
  number = {1},
  pages = {151--158}
}
Lewis, D. Against structural universals 1986 Australasian Journal of Philosophy
Vol. 64(1), pp. 25-46 
article  
BibTeX:
@article{lewis-against-1986,
  author = {Lewis, David},
  title = {Against structural universals},
  journal = {Australasian Journal of Philosophy},
  year = {1986},
  volume = {64},
  number = {1},
  pages = {25--46}
}
Wimsatt, W.C. Aggregate, composed, and evolved systems: Reductionistic heuristics as means to more holistic theories 2006 Biology & Philosophy
Vol. 21(5), pp. 667-702 
article  
Abstract: Richard Levins' distinction between aggregate, composed and evolved systems acquires new significance as we recognize the importance of mechanistic explanation. Criteria for aggregativity provide limiting cases for absence of organization, so through their failure, can provide rich detectors for organizational properties. I explore the use of failures of aggregativity for the analysis of mechanistic systems in diverse contexts. Aggregativity appears theoretically desirable, but we are easily fooled. It may be exaggerated through approximation, conditions of derivation, and extrapolating from some conditions of decomposition illegitimately to others. Evolved systems particularly may require analyses under alternative complementary decompositions. Exploring these conditions helps us to better understand the strengths and limits of reductionistic methods.
BibTeX:
@article{wimsatt-aggregate-2006,
  author = {Wimsatt, William C.},
  title = {Aggregate, composed, and evolved systems: Reductionistic heuristics as means to more holistic theories},
  journal = {Biology & Philosophy},
  year = {2006},
  volume = {21},
  number = {5},
  pages = {667--702}
}
Wimsatt, W.C. Aggregativity: reductive heuristics for finding emergence 1997 Philosophy of Science
Vol. 64(4), pp. S372 
article  
Abstract: Most philosophical accounts of emergence are incompatible with reduction. Most scientists regard a system property as emergent relative to properties of the system's parts if it depends upon their mode of organization–a view consistent with reduction.
BibTeX:
@article{wimsatt-aggregativity:-1997,
  author = {Wimsatt, William C.},
  title = {Aggregativity: reductive heuristics for finding emergence},
  journal = {Philosophy of Science},
  year = {1997},
  volume = {64},
  number = {4},
  pages = {S372}
}
Kythe, D.K. Algebraic and Stochastic Coding Theory 2012   book  
BibTeX:
@book{kythe-algebraic-2012,
  author = {Kythe, Dave K.},
  title = {Algebraic and Stochastic Coding Theory},
  publisher = {CRC Press},
  year = {2012}
}
Zurek, W.H. Algorithmic information content, Church-Turing thesis, physical entropy, and Maxwell's demon 1990   inproceedings  
Abstract: Measurements convert alternative possibilities of their potential outcomes into the definiteness of the "record" – data describing the actual outcome. The resulting decrease of statistical entropy has been, since the inception of Maxwell's demon, regarded as a threat to the second law of thermodynamics. For, when the statistical entropy is employed as the measure of the useful work which can be extracted from the system, its decrease by the information gathering actions of the observer would lead one to believe that, at least from the observer's viewpoint, the second law can be violated. I show that the decrease of ignorance does not necessarily lead to the lowering of disorder of the measured physical system. Measurements can only convert uncertainty (quantified by the statistical entropy) into randomness of the outcome (given by the algorithmic information content of the data). The ability to extract useful work is measured by physical entropy, which is equal to the sum of these two measures of disorder. So defined, physical entropy is, on average, constant in the course of the measurements carried out by the observer on an equilibrium system.
BibTeX:
@inproceedings{zurek-algorithmic-1990,
  author = {Zurek, W. H.},
  title = {Algorithmic information content, Church-Turing thesis, physical entropy, and Maxwell's demon},
  publisher = {DOE/AD},
  year = {1990}
}
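The bookkeeping in Zurek's abstract – measurement trades statistical entropy for algorithmic randomness of the record – can be loosely illustrated with a computable proxy. A minimal sketch, using compressed length as a crude stand-in for algorithmic information content (true Kolmogorov complexity is uncomputable); all numbers are illustrative:

```python
import random
import zlib

random.seed(0)

# Before measurement: 512 bits of statistical entropy about a register
# with 2**512 equiprobable states. After measurement the ignorance is
# gone, but the observer holds a concrete 512-bit record whose
# randomness is now carried as algorithmic information.
n_bits = 512
statistical_entropy = n_bits  # log2 of the number of equiprobable states

typical_record = bytes(random.getrandbits(8) for _ in range(n_bits // 8))
ordered_record = bytes(n_bits // 8)  # the special all-zero outcome

# Compressed length (plus format overhead) as a proxy for the
# algorithmic information content of each record.
typical_bits = 8 * len(zlib.compress(typical_record, 9))
ordered_bits = 8 * len(zlib.compress(ordered_record, 9))

print(statistical_entropy, typical_bits, ordered_bits)
```

A typical outcome is incompressible, so the proxy stays near (or, with format overhead, above) the 512 bits of removed uncertainty; only atypical, ordered outcomes let the sum of the two disorder measures drop.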
Chaitin, G.J. Algorithmic information theory 1987
Vol. 1 
book  
BibTeX:
@book{chaitin-algorithmic-1987,
  author = {Chaitin, Gregory J.},
  title = {Algorithmic information theory},
  publisher = {Cambridge University Press},
  year = {1987},
  volume = {1}
}
Grünwald, P.D. and Vitányi, P.M.B. Algorithmic Information Theory 2008 , pp. 281-317  incollection  
BibTeX:
@incollection{grunwald-algorithmic-2008,
  author = {Grünwald, Peter D. and Vitányi, Paul M. B.},
  title = {Algorithmic Information Theory},
  year = {2008},
  pages = {281--317}
}
Chen, M. and Floridi, L. An analysis of information visualisation 2013 Synthese
Vol. 190(16), pp. 3421-3438 
article  
Abstract: Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of ‘computer-aided seeing’ information in data. Hence, information is the fundamental ‘currency’ exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualization pipeline. We do so from a quantitative and a qualitative perspective. The quantitative analysis is developed on the basis of Shannon’s information theory. The qualitative analysis is developed on the basis of Floridi’s taxonomy in the philosophy of information. We then discuss in detail how the condition of the ‘data processing inequality’ can be broken in a visualisation pipeline. This theoretic finding underlines the usefulness and importance of visualisation in dealing with the increasing problem of data deluge. We show that the subject of visualisation should be studied using both qualitative and quantitative approaches, preferably in an interdisciplinary synergy between information theory and the philosophy of information.
BibTeX:
@article{chen-analysis-2013,
  author = {Chen, Min and Floridi, Luciano},
  title = {An analysis of information visualisation},
  journal = {Synthese},
  year = {2013},
  volume = {190},
  number = {16},
  pages = {3421--3438}
}
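The standard data processing inequality this abstract starts from is easy to check numerically. A minimal sketch with made-up distributions: for a Markov chain X → Y → Z, I(X;Z) ≤ I(X;Y), i.e. downstream processing cannot add information about X.

```python
import math

xs = range(4)

def p_y_given_x(y, x):
    # Y = X with probability 0.925, otherwise one of the other values
    return 0.925 if y == x else 0.025

# X uniform on {0,1,2,3}; joint distribution of (X, Y)
joint_xy = {(x, y): 0.25 * p_y_given_x(y, x) for x in xs for y in xs}

# Z is a deterministic "processing" of Y that forgets everything
# except parity.
joint_xz = {}
for (x, y), p in joint_xy.items():
    key = (x, y % 2)
    joint_xz[key] = joint_xz.get(key, 0.0) + p

def mutual_information(joint):
    px, py = {}, {}
    for (a, b), p in joint.items():
        px[a] = px.get(a, 0.0) + p
        py[b] = py.get(b, 0.0) + p
    return sum(p * math.log2(p / (px[a] * py[b]))
               for (a, b), p in joint.items() if p > 0)

i_xy = mutual_information(joint_xy)
i_xz = mutual_information(joint_xz)
print(i_xy, i_xz)  # i_xz cannot exceed i_xy
```

Chen and Floridi's point is that a visualisation pipeline, by adding the viewer's knowledge, escapes this purely syntactic bound.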
Franklin, J. An Aristotelian realist philosophy of mathematics: mathematics as the science of quantity and structure 2014   book  
BibTeX:
@book{franklin-aristotelian-2014,
  author = {Franklin, James},
  title = {An Aristotelian realist philosophy of mathematics: mathematics as the science of quantity and structure},
  publisher = {Palgrave Macmillan},
  year = {2014}
}
Joyce, P., Rokyta, D.R., Wichman, H.A. and Caudle, S.B. An empirical test of the mutational landscape model of adaptation using a single-stranded DNA virus 2005 Nature genetics
Vol. 37(4), pp. 441-444 
article  
Abstract: The primary impediment to formulating a general theory for adaptive evolution has been the unknown distribution of fitness effects for new beneficial mutations. By applying extreme value theory, Gillespie circumvented this issue in his mutational landscape model for the adaptation of DNA sequences, and Orr recently extended Gillespie's model, generating testable predictions regarding the course of adaptive evolution. Here we provide the first empirical examination of this model, using a single-stranded DNA bacteriophage related to φX174, and find that our data are consistent with Orr's predictions, provided that the model is adjusted to incorporate mutation bias. Orr's work suggests that there may be generalities in adaptive molecular evolution that transcend the biological details of a system, but we show that for the model to be useful as a predictive or inferential tool, some adjustments for the biology of the system will be necessary.
BibTeX:
@article{joyce-empirical-2005,
  author = {Joyce, Paul and Rokyta, Darin R. and Wichman, Holly A. and Caudle, S. B.},
  title = {An empirical test of the mutational landscape model of adaptation using a single-stranded DNA virus},
  journal = {Nature genetics},
  year = {2005},
  volume = {37},
  number = {4},
  pages = {441--444}
}
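A toy version of the model's ingredients, with illustrative numbers rather than the paper's analysis: beneficial selection coefficients s are drawn from an exponential tail, as extreme value theory suggests, and a new mutation escapes stochastic loss with probability ~2s. Fixation then biases adaptation toward the larger available effects.

```python
import random

random.seed(42)

mean_s = 0.01
arising, fixed = [], []
for _ in range(200000):
    s = random.expovariate(1 / mean_s)   # effect of a new beneficial mutation
    arising.append(s)
    if random.random() < 2 * s:          # classic 2s fixation heuristic
        fixed.append(s)

mean_arising = sum(arising) / len(arising)
mean_fixed = sum(fixed) / len(fixed)
print(mean_arising, mean_fixed)  # fixed effects average ~2x arising ones
```

For an exponential effect distribution, size-biasing by fixation probability exactly doubles the mean effect, which is the kind of distribution-level prediction the paper tests empirically.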
Tribus, M. An Engineer Looks at Bayes 1988 Maximum-entropy and Bayesian methods in science and engineering  inproceedings  
BibTeX:
@inproceedings{tribus-engineer-1988,
  author = {Tribus, Myron},
  title = {An Engineer Looks at Bayes},
  booktitle = {Maximum-entropy and Bayesian methods in science and engineering},
  publisher = {Kluwer Academic Publishers},
  year = {1988}
}
Hume, D. An enquiry concerning human understanding 1748   book  
BibTeX:
@book{hume-enquiry-1748,
  author = {Hume, David},
  title = {An enquiry concerning human understanding},
  publisher = {Alex Catalogue},
  year = {1748}
}
Behin-Ain, S., van Doorn, T. and Patterson, J.R. An indeterministic Monte Carlo technique for fast time of flight photon transport through optically thick turbid media 2002 Medical physics
Vol. 29(2), pp. 125-131 
article  
Abstract: A time-resolved indeterministic Monte Carlo (IMC) simulation technique is proposed for the efficient construction of the early part of the temporal point spread function (TPSF) of visible or near infrared photons transmitted through an optically thick scattering medium. By assuming a detected photon is a superposition of photon components, the photon is repropagated from a point in the original path where a significant delay in forward propagation occurred. A weight is then associated with each subsequently detected photon to compensate for shorter components. The technique is shown to reduce the computation time by a factor of at least 4 when simulating the sub-200 picosecond region of the TPSF and hence provides a useful tool for analysis of single photon detection in transillumination imaging.
BibTeX:
@article{behin-ain-indeterministic-2002,
  author = {Behin-Ain, S. and van Doorn, T. and Patterson, J. R.},
  title = {An indeterministic Monte Carlo technique for fast time of flight photon transport through optically thick turbid media},
  journal = {Medical physics},
  year = {2002},
  volume = {29},
  number = {2},
  pages = {125--131}
}
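For context, a deliberately plain baseline Monte Carlo (not the paper's indeterministic variant) shows the quantity being accelerated: the temporal point spread function of photons transmitted through a thick scattering slab. All parameter values are illustrative.

```python
import math
import random

random.seed(1)

mu_s = 10.0       # scattering coefficient, mm^-1 (assumed)
thickness = 5.0   # slab thickness, mm (optical depth 50: "thick")
c = 0.214         # light speed in tissue, mm/ps (n ~ 1.4)

def transit_time():
    z, path, uz = 0.0, 0.0, 1.0                  # pencil beam enters along +z
    while True:
        step = -math.log(1.0 - random.random()) / mu_s  # exponential free path
        z += uz * step
        path += step
        if z >= thickness:
            return path / c                       # transmitted: arrival time, ps
        if z < 0:
            return None                           # back-scattered out and lost
        uz = random.uniform(-1.0, 1.0)            # isotropic direction cosine

times = [t for t in (transit_time() for _ in range(5000)) if t is not None]
print(len(times), min(times))
```

The earliest arrivals approach the ballistic limit thickness/c, and it is exactly this sparsely populated early part of the TPSF that makes naive sampling slow and motivates the weighted repropagation scheme of the paper.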
Bueno, O. and Colyvan, M. An Inferential Conception of the Application of Mathematics 2011 Noûs
Vol. 45(2), pp. 345-374 
article  
Abstract: A number of people have recently argued for a structural approach to accounting for the applications of mathematics. Such an approach has been called “the mapping account”. According to this view, the applicability of mathematics is fully accounted for by appreciating the relevant structural similarities between the empirical system under study and the mathematics used in the investigation of that system. This account of applications requires the truth of applied mathematical assertions, but it does not require the existence of mathematical objects. In this paper, we discuss the shortcomings of this account, and show how these shortcomings can be overcome by a broader view of the application of mathematics: the inferential conception.
BibTeX:
@article{bueno-inferential-2011,
  author = {Bueno, Otávio and Colyvan, Mark},
  title = {An Inferential Conception of the Application of Mathematics},
  journal = {Noûs},
  year = {2011},
  volume = {45},
  number = {2},
  pages = {345--374}
}
Tononi, G. An information integration theory of consciousness 2004 BMC Neuroscience
Vol. 5(1), pp. 42 
article DOI URL 
Abstract: Consciousness poses two main problems. The first is understanding the conditions that determine to what extent a system has conscious experience. For instance, why is our consciousness generated by certain parts of our brain, such as the thalamocortical system, and not by other parts, such as the cerebellum? And why are we conscious during wakefulness and much less so during dreamless sleep? The second problem is understanding the conditions that determine what kind of consciousness a system has. For example, why do specific parts of the brain contribute specific qualities to our conscious experience, such as vision and audition?
BibTeX:
@article{tononi-information-2004,
  author = {Tononi, Giulio},
  title = {An information integration theory of consciousness},
  journal = {BMC Neuroscience},
  year = {2004},
  volume = {5},
  number = {1},
  pages = {42},
  url = {http://dx.doi.org/10.1186/1471-2202-5-42},
  doi = {10.1186/1471-2202-5-42}
}
Millikan, R.G. An Input Condition for Teleosemantics? Reply to Shea (And Godfrey-Smith) 2007 Philosophy and Phenomenological Research
Vol. 75(2), pp. 436-455 
article  
BibTeX:
@article{millikan-input-2007,
  author = {Millikan, Ruth G.},
  title = {An Input Condition for Teleosemantics? Reply to Shea (And Godfrey-Smith)},
  journal = {Philosophy and Phenomenological Research},
  year = {2007},
  volume = {75},
  number = {2},
  pages = {436--455}
}
Borrill, P.L. An Insight into Information, Entanglement and Time 2015 It From Bit or Bit From It?, pp. 97-112  incollection  
BibTeX:
@incollection{borrill-insight-2015,
  author = {Borrill, Paul L.},
  title = {An Insight into Information, Entanglement and Time},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {97--112}
}
Teller, P. An interpretive introduction to quantum field theory 1995   book  
BibTeX:
@book{teller-interpretive-1995,
  author = {Teller, Paul},
  title = {An interpretive introduction to quantum field theory},
  publisher = {Princeton University Press},
  year = {1995}
}
Susskind, L. and Lindesay, J. An introduction to black holes, information and the string theory revolution: the holographic universe 2005   book  
BibTeX:
@book{susskind-introduction-2005,
  author = {Susskind, Leonard and Lindesay, James},
  title = {An introduction to black holes, information and the string theory revolution: the holographic universe},
  publisher = {World Scientific},
  year = {2005}
}
Fredkin, E. An Introduction to Digital Philosophy 2003 International Journal of Theoretical Physics
Vol. 42(2), pp. 189-247 
article  
Abstract: Digital Philosophy (DP) is a new way of thinking about how things work. This paper can be viewed as a continuation of the author's work of 1990[3]; it is based on the general concept of replacing normal mathematical models, such as partial differential equations, with Digital Mechanics (DM). DP is based on two concepts: bits, like the binary digits in a computer, correspond to the most microscopic representation of state information; and the temporal evolution of state is a digital informational process similar to what goes on in the circuitry of a computer processor. We are motivated in this endeavor by the remarkable clarification that DP seems able to provide with regard to many of the most fundamental questions about processes we observe in our world.
BibTeX:
@article{fredkin-introduction-2003,
  author = {Fredkin, Edward},
  title = {An Introduction to Digital Philosophy},
  journal = {International Journal of Theoretical Physics},
  year = {2003},
  volume = {42},
  number = {2},
  pages = {189--247}
}
Grazioso, F. An introduction to information theory and some of its applications: black hole information paradox and renormalization group information flow 2015 Canadian Journal of Physics
Vol. 93(9) 
article  
BibTeX:
@article{grazioso-introduction-2015,
  author = {Grazioso, Fabio},
  title = {An introduction to information theory and some of its applications: black hole information paradox and renormalization group information flow},
  journal = {Canadian Journal of Physics},
  year = {2015},
  volume = {93},
  number = {9}
}
Vitanyi, P.M.B. and Li, M. An Introduction to Kolmogorov Complexity and its Applications 2009   book URL 
BibTeX:
@book{vitanyi-introduction-2009,
  author = {Vitanyi, Paul M. B. and Li, Ming},
  title = {An Introduction to Kolmogorov Complexity and its Applications},
  publisher = {Springer Science & Business Media},
  year = {2009},
  url = {https://books.google.com.au/books?id=25fue3UYDN0C}
}
Wiltshire, D.L. An introduction to quantum cosmology 2000   article  
Abstract: This is an introductory set of lecture notes on quantum cosmology, given in 1995 to an audience with interests ranging from astronomy to particle physics. Topics covered: 1. Introduction: 1.1 Quantum cosmology and quantum gravity; 1.2 A brief history of quantum cosmology. 2. Hamiltonian formulation of general relativity: 2.1 The 3+1 decomposition; 2.2 The action. 3. Quantisation: 3.1 Superspace; 3.2 Canonical quantisation; 3.3 Path integral quantisation; 3.4 Minisuperspace; 3.5 The WKB approximation; 3.6 Probability measures; 3.7 Minisuperspace for the Friedmann universe with massive scalar field. 4. Boundary Conditions: 4.1 The no-boundary proposal; 4.2 The tunneling proposal. 5. The predictions of quantum cosmology: 5.1 The period of inflation; 5.2 The origin of density perturbations; 5.3 The arrow of time.
BibTeX:
@article{wiltshire-introduction-2000,
  author = {Wiltshire, D. L.},
  title = {An introduction to quantum cosmology},
  year = {2000}
}
Colyvan, M. An introduction to the philosophy of mathematics 2012   book  
BibTeX:
@book{colyvan-introduction-2012,
  author = {Colyvan, Mark},
  title = {An introduction to the philosophy of mathematics},
  publisher = {Cambridge University Press},
  year = {2012}
}
Davies, E.B. and Lewis, J.T. An operational approach to quantum probability 1970 Communications in Mathematical Physics
Vol. 17(3), pp. 239-260 
article DOI URL 
Abstract: In order to provide a mathmatical framework for the process of making repeated measurements on continuous observables in a statistical system we make a mathematical definition of an instrument, a concept which generalises that of an observable and that of an operation. It is then possible to develop such notions as joint and conditional probabilities without any of the commutation conditions needed in the approach via observables. One of the crucial notions is that of repeatability which we show is implicitly assumed in most of the axiomatic treatments of quantum mechanics, but whose abandonment leads to a much more flexible approach to measurement theory.
BibTeX:
@article{davies-operational-1970,
  author = {Davies, E. B. and Lewis, J. T.},
  title = {An operational approach to quantum probability},
  journal = {Communications in Mathematical Physics},
  year = {1970},
  volume = {17},
  number = {3},
  pages = {239--260},
  url = {http://dx.doi.org/10.1007/BF01647093},
  doi = {10.1007/BF01647093}
}
Carnap, R. and Bar-Hillel, Y. An Outline of a Theory of Semantic Information 1952 Technical Report No. 247, Research Laboratory of Electronics, Massachusetts Institute of Technology  techreport  
BibTeX:
@techreport{carnap-outline-1952,
  author = {Carnap, Rudolf and Bar-Hillel, Yehoshua},
  title = {An Outline of a Theory of Semantic Information},
  institution = {Research Laboratory of Electronics, Massachusetts Institute of Technology},
  year = {1952},
  number = {247}
}
Floridi, L. Ancient scepticism and the sceptical tradition 2001
Vol. 39(4) 
book  
Abstract: "Ancient Scepticism and the Sceptical Tradition" edited by Juha Sihvola is reviewed.
BibTeX:
@book{floredi-ancient-2001,
  author = {Floridi, L.},
  title = {Ancient scepticism and the sceptical tradition},
  year = {2001},
  volume = {39},
  number = {4}
}
Petty, W., Sir Another essay in political arithmetick, concerning the growth of the city of London: with the measures, periods, causes, and consequences thereof : 1682 1683   book  
BibTeX:
@book{petty-another-1683,
  author = {Petty, William, Sir},
  title = {Another essay in political arithmetick, concerning the growth of the city of London: with the measures, periods, causes, and consequences thereof : 1682},
  publisher = {Printed by H.H. for Mark Pardoe},
  year = {1683}
}
Cao, T.Y. Appendix: Ontological Relativity and Fundamentality: Is QFT the Fundamental Theory? 2003 Synthese
Vol. 136(1), pp. 25-30 
article  
BibTeX:
@article{cao-appendix:-2003,
  author = {Cao, Tian Y.},
  title = {Appendix: Ontological Relativity and Fundamentality: Is QFT the Fundamental Theory?},
  journal = {Synthese},
  year = {2003},
  volume = {136},
  number = {1},
  pages = {25--30}
}
Ryabko, B. Applications of Universal Source Coding to Statistical Analysis of Time Series 2008 CoRR
Vol. abs/0809.1226 
article URL 
BibTeX:
@article{ryabko-applications-2008,
  author = {Ryabko, Boris},
  title = {Applications of Universal Source Coding to Statistical Analysis of Time Series},
  journal = {CoRR},
  year = {2008},
  volume = {abs/0809.1226},
  url = {http://arxiv.org/abs/0809.1226}
}
Oriti, D. Approaches to quantum gravity: toward a new understanding of space, time and matter 2009   book  
BibTeX:
@book{oriti-approaches-2009,
  author = {Oriti, Daniele},
  title = {Approaches to quantum gravity: toward a new understanding of space, time and matter},
  publisher = {Cambridge University Press},
  year = {2009}
}
Adriaans, P. and Vitanyi, P.M.B. Approximation of the Two-Part MDL Code 2009 IEEE Transactions on Information Theory
Vol. 55(1), pp. 444-457 
article  
Abstract: Approximation of the optimal two-part minimum description length (MDL) code for given data, through successive monotonically length-decreasing two-part MDL codes, has the following properties: (i) computation of each step may take arbitrarily long; (ii) we may not know when we reach the optimum, or whether we will reach the optimum at all; (iii) the sequence of models generated may not monotonically improve the goodness of fit; but (iv) the model associated with the optimum has (almost) the best goodness of fit. To express the practically interesting goodness of fit of individual models for individual data sets we have to rely on Kolmogorov complexity.
BibTeX:
@article{adriaans-approximation-2009,
  author = {Adriaans, Pieter and Vitanyi, Paul M. B.},
  title = {Approximation of the Two-Part MDL Code},
  journal = {IEEE Transactions on Information Theory},
  year = {2009},
  volume = {55},
  number = {1},
  pages = {444--457}
}
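The two-part code structure the abstract refers to is simple to sketch. A hypothetical toy instance (not the paper's algorithm): the model class is Bernoulli(θ) on a grid θ = k/n, the model cost is a uniform code over the grid, and the data cost is the Shannon codelength of the string under that θ; the two-part MDL estimate minimises their sum.

```python
import math

def two_part_mdl(bits):
    n, ones = len(bits), sum(bits)
    zeros = n - ones
    best_theta, best_len = None, float("inf")
    for k in range(1, n):                 # grid excludes theta = 0, 1
        theta = k / n
        model_cost = math.log2(n - 1)     # uniform code: which grid point
        data_cost = -(ones * math.log2(theta) + zeros * math.log2(1 - theta))
        if model_cost + data_cost < best_len:
            best_theta, best_len = theta, model_cost + data_cost
    return best_theta, best_len

theta_reg, len_reg = two_part_mdl([0] * 90 + [1] * 10)   # regular data
theta_bal, len_bal = two_part_mdl([0, 1] * 50)           # balanced data
print(theta_reg, len_reg, len_bal)
```

Regular data gets a total codelength well below the 100-bit literal encoding, balanced data does not; in this tiny model class the search is exhaustive, whereas the paper's point is that over all computable models the analogous search has no effective stopping rule.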
Werndl, C. Are deterministic descriptions and indeterministic descriptions observationally equivalent? 2009 Studies in History and Philosophy of Modern Physics
Vol. 40(3), pp. 232-242 
article  
Abstract: The central question of this paper is: are deterministic and indeterministic descriptions observationally equivalent in the sense that they give the same predictions? I tackle this question for measure-theoretic deterministic systems and stochastic processes, both of which are ubiquitous in science. I first show that for many measure-theoretic deterministic systems there is a stochastic process which is observationally equivalent to the deterministic system. Conversely, I show that for all stochastic processes there is a measure-theoretic deterministic system which is observationally equivalent to the stochastic process. Still, one might guess that the measure-theoretic deterministic systems which are observationally equivalent to stochastic processes used in science do not include any deterministic systems used in science. I argue that this is not so because deterministic systems used in science even give rise to Bernoulli processes. Despite this, one might guess that measure-theoretic deterministic systems used in science cannot give the same predictions at every observation level as stochastic processes used in science. By proving results in ergodic theory, I show that also this guess is misguided: there are several deterministic systems used in science which give the same predictions at every observation level as Markov processes. All these results show that measure-theoretic deterministic systems and stochastic processes are observationally equivalent more often than one might perhaps expect. Furthermore, I criticize the claims of some previous philosophy papers on observational equivalence.
BibTeX:
@article{werndl-are-2009,
  author = {Werndl, Charlotte},
  title = {Are deterministic descriptions and indeterministic descriptions observationally equivalent?},
  journal = {Studies in History and Philosophy of Modern Physics},
  year = {2009},
  volume = {40},
  number = {3},
  pages = {232--242}
}
Baker, A. Are there Genuine Mathematical Explanations of Physical Phenomena? 2005 Mind
Vol. 114(454), pp. 223-238 
article  
Abstract: Many explanations in science make use of mathematics. But are there cases where the mathematical component of a scientific explanation is explanatory in its own right? This issue of mathematical explanations in science has been for the most part neglected. I argue that there are genuine mathematical explanations in science, and present in some detail an example of such an explanation, taken from evolutionary biology, involving periodical cicadas. I also indicate how the answer to my title question impacts on broader issues in the philosophy of mathematics; in particular it may help platonists respond to a recent challenge by Joseph Melia concerning the force of the Indispensability Argument.
BibTeX:
@article{baker-are-2005,
  author = {Baker, Alan},
  title = {Are there Genuine Mathematical Explanations of Physical Phenomena?},
  journal = {Mind},
  year = {2005},
  volume = {114},
  number = {454},
  pages = {223--238}
}
Crick, F. and Koch, C. Are we aware of neural activity in primary visual cortex? 1995 Nature
Vol. 375 
article DOI URL 
BibTeX:
@article{crick-are-1995,
  author = {Crick, F. and Koch, C.},
  title = {Are we aware of neural activity in primary visual cortex?},
  journal = {Nature},
  year = {1995},
  volume = {375},
  url = {http://dx.doi.org/10.1038/375121a0},
  doi = {http://doi.org/10.1038/375121a0}
}
Bostrom, N. Are We Living in a Computer Simulation? 2003 The Philosophical Quarterly
Vol. 53(211), pp. 243-255 
article  
Abstract: I argue that at least one of the following propositions is true: (1) the human species is very likely to become extinct before reaching a 'posthuman' stage; (2) any posthuman civilization is extremely unlikely to run a significant number of simulations of its evolutionary history (or variations thereof); (3) we are almost certainly living in a computer simulation. It follows that the belief that there is a significant chance that we shall one day become posthumans who run ancestor-simulations is false, unless we are currently living in a simulation. I discuss some consequences of this result.
BibTeX:
@article{bostrom-are-2003,
  author = {Bostrom, Nick},
  title = {Are We Living in a Computer Simulation?},
  journal = {The Philosophical Quarterly},
  year = {2003},
  volume = {53},
  number = {211},
  pages = {243--255}
}
Leunissen, M. Aristotle's Physics: A Critical Guide 2015   book  
BibTeX:
@book{leunissen-aristotles-2015,
  author = {Leunissen, Mariska},
  title = {Aristotle's Physics: A Critical Guide},
  publisher = {Cambridge University Press},
  year = {2015}
}
Jackson, F. Armchair Metaphysics 1994 Philosophy in Mind, pp. 23-42  incollection  
BibTeX:
@incollection{jackson-armchair-1994,
  author = {Jackson, Frank},
  title = {Armchair Metaphysics},
  booktitle = {Philosophy in Mind},
  publisher = {Kluwer},
  year = {1994},
  pages = {23--42}
}
Hawkins, R.J. and Frieden, B.R. Asymmetric information and quantization in financial economics 2012 International Journal of Mathematics and Mathematical Sciences
Vol. 2012 
article  
BibTeX:
@article{hawkins-asymmetric-2012,
  author = {Hawkins, Raymond J. and Frieden, B. R.},
  title = {Asymmetric information and quantization in financial economics},
  journal = {International Journal of Mathematics and Mathematical Sciences},
  year = {2012},
  volume = {2012}
}
Page, D.N. Average entropy of a subsystem 1993 Physical Review Letters
Vol. 71(9), pp. 1291-1294 
article  
Abstract: If a quantum system of Hilbert space dimension mn is in a random pure state, the average entropy of a subsystem of dimension m ≤ n is conjectured to be S(m, n) = Σ_{k=n+1}^{mn} 1/k − (m−1)/(2n), and is shown to be approximately ln m − m/(2n) for 1 ≪ m ≤ n. Thus there is less than one-half unit of information, on average, in the smaller subsystem of a total system in a random pure state.
BibTeX:
@article{page-average-1993,
  author = {Page, Don N.},
  title = {Average entropy of a subsystem},
  journal = {Physical Review Letters},
  year = {1993},
  volume = {71},
  number = {9},
  pages = {1291--1294}
}
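Page's conjectured formula above is straightforward to evaluate numerically. The sketch below assumes the reconstruction S(m, n) = Σ_{k=n+1}^{mn} 1/k − (m−1)/(2n); the function name `page_entropy` is illustrative, not from the paper.

```python
import math

def page_entropy(m, n):
    """Page's conjectured average entanglement entropy of an m-dimensional
    subsystem of a random pure state in dimension m*n (with m <= n):
    sum_{k=n+1}^{mn} 1/k - (m-1)/(2n)."""
    return sum(1.0 / k for k in range(n + 1, m * n + 1)) - (m - 1) / (2.0 * n)

# For 1 << m <= n the value approaches ln(m) - m/(2n), i.e. it falls short
# of the maximal entropy ln(m) by less than half a unit of information.
m, n = 2, 50
print(page_entropy(m, n), math.log(m) - m / (2.0 * n))
```

For m = 2, n = 50 the exact sum and the asymptotic estimate already agree to about two decimal places, illustrating the "less than one-half unit of information" claim in the abstract.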
Bub, J. Bananaworld: Quantum Mechanics for Primates 2016   book URL 
Abstract: The fascinating discoveries of the new fields of quantum information, quantum computation, and quantum cryptography are brought to life in this book in a way that is accessible and interesting to a wide range of readers, not just the experts. From a modern perspective, the characteristic feature of quantum mechanics is the existence of strangely counterintuitive correlations between distant events, which can be exploited in feats like quantum teleportation, unbreakable cryptographic schemes, and computers with enormously enhanced computing power. Schrödinger coined the term “entanglement” to describe these bizarre correlations, which show up in the random outcomes of different measurements on separated quantum systems. Bananaworld – an imaginary island with entangled bananas – is used to discuss sophisticated quantum phenomena without the mathematical machinery of quantum mechanics. As far as the conceptual problems of the theory that philosophers worry about are concerned, one might as well talk about bananas rather than quantum states. Nevertheless, the connection with quantum correlations is fully explained in sections written for the non-physicist reader with a serious interest in understanding the mysteries of the quantum world. The result is a subversive but entertaining book, with the novel thesis that quantum mechanics is about the structure of information, and what we have discovered is that the possibilities for representing, manipulating, and communicating information are different than we thought.
BibTeX:
@book{bub-bananaworld:-2016,
  author = {Bub, Jeffrey},
  title = {Bananaworld: Quantum Mechanics for Primates},
  publisher = {Oxford University Press},
  year = {2016},
  url = {http://www.oxfordscholarship.com/view/10.1093/acprof:oso/9780198718536.001.0001/acprof-9780198718536}
}
Earman, J. Bangs, crunches, whimpers, and shrieks: singularities and acausalities in relativistic spacetimes 1995   book  
BibTeX:
@book{earman-bangs-1995,
  author = {Earman, John},
  title = {Bangs, crunches, whimpers, and shrieks: singularities and acausalities in relativistic spacetimes},
  publisher = {Oxford University Press},
  year = {1995}
}
Rybakov, V.V. Barwise's information frames and modal logics 2002 Algebra and Logic
Vol. 41(5), pp. 323-336 
article  
BibTeX:
@article{rybakov-barwises-2002,
  author = {Rybakov, V. V.},
  title = {Barwise's information frames and modal logics},
  journal = {Algebra and Logic},
  year = {2002},
  volume = {41},
  number = {5},
  pages = {323--336}
}
Middleton, F.A. and Strick, P.L. Basal ganglia and cerebellar loops: motor and cognitive circuits 2000 Brain Res Brain Res Rev
Vol. 31 
article DOI URL 
BibTeX:
@article{middleton-basal-2000,
  author = {Middleton, F. A. and Strick, P. L.},
  title = {Basal ganglia and cerebellar loops: motor and cognitive circuits},
  journal = {Brain Res Brain Res Rev},
  year = {2000},
  volume = {31},
  url = {http://dx.doi.org/10.1016/S0165-0173(99)00040-5},
  doi = {http://doi.org/10.1016/S0165-0173(99)00040-5}
}
Alexander, G.E., Crutcher, M.D. and DeLong, M.R. Basal ganglia-thalamocortical circuits: parallel substrates for motor, oculomotor, “prefrontal” and “limbic” functions 1990 Prog Brain Res
Vol. 85 
article DOI URL 
BibTeX:
@article{alexander-basal-1990,
  author = {Alexander, G. E. and Crutcher, M. D. and DeLong, M. R.},
  title = {Basal ganglia-thalamocortical circuits: parallel substrates for motor, oculomotor, “prefrontal” and “limbic” functions},
  journal = {Prog Brain Res},
  year = {1990},
  volume = {85},
  url = {http://dx.doi.org/10.1016/S0079-6123(08)62678-3},
  doi = {http://doi.org/10.1016/S0079-6123(08)62678-3}
}
Talbott, W. Bayesian Epistemology 2016 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{talbott-bayesian-2016,
  author = {Talbott, William},
  title = {Bayesian Epistemology},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2016},
  edition = {Winter 2016},
  url = {https://plato.stanford.edu/archives/win2016/entries/epistemology-bayesian/}
}
Metzinger, T. Being No One The Self-Model Theory of Subjectivity 2003   book  
BibTeX:
@book{metzinger-being-2003,
  author = {Metzinger, T.},
  title = {Being No One The Self-Model Theory of Subjectivity},
  publisher = {MIT Press},
  year = {2003}
}
Hohwy, J. and Kallestrup, J. Being reduced: new essays on reduction, explanation, and causation 2008   book  
BibTeX:
@book{hohwy-being-2008,
  author = {Hohwy, Jakob and Kallestrup, Jesper},
  title = {Being reduced: new essays on reduction, explanation, and causation},
  publisher = {Oxford University Press},
  year = {2008}
}
Taylor, K.A. Belief, Information and Semantic Content: A Naturalist's Lament 1987 Synthese
Vol. 71(1), pp. 97-124 
article  
BibTeX:
@article{taylor-belief-1987,
  author = {Taylor, Kenneth A.},
  title = {Belief, Information and Semantic Content: A Naturalist's Lament},
  journal = {Synthese},
  year = {1987},
  volume = {71},
  number = {1},
  pages = {97--124}
}
Esfeld, M. Bell's Theorem and the Issue of Determinism and Indeterminism 2015 FOUNDATIONS OF PHYSICS
Vol. 45(5), pp. 471-482 
article  
Abstract: The paper considers the claim that quantum theories with a deterministic dynamics of objects in ordinary space-time, such as Bohmian mechanics, contradict the assumption that the measurement settings can be freely chosen in the EPR experiment. That assumption is one of the premises of Bell's theorem. I first argue that only a premise to the effect that what determines the choice of the measurement settings is independent of what determines the past state of the measured system is needed for the derivation of Bell's theorem. Determinism as such does not undermine that independence (unless there are particular initial conditions of the universe that would amount to conspiracy). Only entanglement could do so. However, generic entanglement without collapse on the level of the universal wave-function can go together with effective wave-functions for subsystems of the universe, as in Bohmian mechanics. The paper argues that such effective wave-functions are sufficient for the mentioned independence premise to hold.
BibTeX:
@article{esfeld-bells-2015,
  author = {Esfeld, M.},
  title = {Bell's Theorem and the Issue of Determinism and Indeterminism},
  journal = {FOUNDATIONS OF PHYSICS},
  year = {2015},
  volume = {45},
  number = {5},
  pages = {471--482}
}
Clauser, J.F. and Shimony, A. Bell's theorem. Experimental tests and implications 1978 Reports on Progress in Physics
Vol. 41(12), pp. 1881-1927 
article  
BibTeX:
@article{clauser-bells-1978,
  author = {Clauser, J. F. and Shimony, A.},
  title = {Bell's theorem. Experimental tests and implications},
  journal = {Reports on Progress in Physics},
  year = {1978},
  volume = {41},
  number = {12},
  pages = {1881--1927}
}
Doppelt, G. Best theory scientific realism 2014 European Journal for Philosophy of Science
Vol. 4(2), pp. 271-291 
article  
Abstract: The aim of this essay is to argue for a new version of 'inference-to-the-best-explanation' scientific realism, which I characterize as Best Theory Realism or 'BTR'. On BTR, the realist needs only to embrace a commitment to the truth or approximate truth of the best theories in a field, those which are unique in satisfying the highest standards of empirical success in a mature field with many successful but falsified predecessors. I argue that taking our best theories to be true is justified because it provides the best explanation of (1) the predictive success of their predecessors and (2) their own special success. Against standard and especially structural realism, I argue against the claim that the best explanations of the success of theories is provided by identifying their true components, such as structural relations between unobservables, which are preserved across theory change. In particular, I criticize Ladyman's and Carrier's structural account of the success of phlogiston theory, and Worrall's well-known structural account of the success of Fresnel's theory of light. I argue that these accounts tacitly assume the truth of our best theories, which in any case provides a better explanation of these theories' success than the structural account. Structural realism is now defended as the only version of realism that is able to surmount the pessimistic meta-induction and the general problem that successful theories involve ontological claims concerning unobservable entities that are abandoned and falsified in theory-change. I argue that Best Theory Realism can overcome the pessimistic meta-induction and this general problem posed by theory-change. Our best theories possess a characteristic which sharply distinguishes them from their successful but false predecessors. Furthermore, 'inference-to-the-best-explanation' confirmation can establish the truth of our best theories and thus trumps the pessimistic inductive reasoning which is supposed to show that even our best theories are most likely false in their claims concerning unobservable entities and processes.
BibTeX:
@article{doppelt-best-2014,
  author = {Doppelt, Gerald},
  title = {Best theory scientific realism},
  journal = {European Journal for Philosophy of Science},
  year = {2014},
  volume = {4},
  number = {2},
  pages = {271--291}
}
Braunstein, S.L., Pirandola, S. and Zyczkowski, K. Better late than never: Information retrieval from black holes 2013 Physical Review Letters
Vol. 110(10), pp. 101301 
article  
Abstract: We show that, in order to preserve the equivalence principle until late times in unitarily evaporating black holes, the thermodynamic entropy of a black hole must be primarily entropy of entanglement across the event horizon. For such black holes, we show that the information entering a black hole becomes encoded in correlations within a tripartite quantum state, the quantum analogue of a one-time pad, and is only decoded into the outgoing radiation very late in the evaporation. This behavior generically describes the unitary evaporation of highly entangled black holes and requires no specially designed evolution. Our work suggests the existence of a matter-field sum rule for any fundamental theory. DOI: 10.1103/PhysRevLett.110.101301
BibTeX:
@article{braunstein-better-2013,
  author = {Braunstein, Samuel L. and Pirandola, Stefano and Zyczkowski, Karol},
  title = {Better late than never: Information retrieval from black holes},
  journal = {Physical Review Letters},
  year = {2013},
  volume = {110},
  number = {10},
  pages = {101301}
}
Atmanspacher, H. and Bishop, R. Between Chance and Choice: Interdisciplinary Perspectives on Determinism 2014   book  
BibTeX:
@book{atmanspacher-between-2014,
  author = {Atmanspacher, Harald and Bishop, Robert},
  title = {Between Chance and Choice: Interdisciplinary Perspectives on Determinism},
  publisher = {Andrews UK},
  year = {2014},
  edition = {2nd}
}
Adriaans, P. Between order and chaos: The quest for meaningful information 2009 Theory of Computing Systems
Vol. 45(4), pp. 650-674 
article  
Abstract: The notion of meaningful information seems to be associated with the sweet spot between order and chaos. This form of meaningfulness of information, which is primarily what science is interested in, is not captured by either Shannon information or Kolmogorov complexity. In this paper I develop a theoretical framework that can be seen as a first approximation to a study of meaningful information. In this context I introduce the notion of facticity of a data set. I discuss the relation between thermodynamics and algorithmic complexity theory in the context of this problem. I prove that, under adequate measurement conditions, the free energy of a system in the world is associated with the randomness deficiency of a data set with observations about this system. These insights suggest an explanation of the efficiency of human intelligence in terms of helpful distributions. Finally I give a critical discussion of Schmidhuber's views, specifically his notion of low complexity art; I defend the view that artists optimize facticity instead.
BibTeX:
@article{adriaans-between-2009,
  author = {Adriaans, P.},
  title = {Between order and chaos: The quest for meaningful information},
  journal = {Theory of Computing Systems},
  year = {2009},
  volume = {45},
  number = {4},
  pages = {650--674}
}
Mari, L. Beyond the representational viewpoint: a new formalization of measurement 2000 Measurement
Vol. 27(2), pp. 71-84 
article  
BibTeX:
@article{mari-beyond-2000,
  author = {Mari, Luca},
  title = {Beyond the representational viewpoint: a new formalization of measurement},
  journal = {Measurement},
  year = {2000},
  volume = {27},
  number = {2},
  pages = {71--84}
}
Gupta, A., Condit, C. and Qian, X. BioDB: An ontology-enhanced information system for heterogeneous biological information 2010 Data & Knowledge Engineering
Vol. 69(11), pp. 1084-1102 
article  
Abstract: This paper presents BIODB, an ontology-enhanced information system to manage heterogeneous data. An ontology-enhanced system is a system where ad hoc data is imported into the system by a user, annotated by the user to connect the data to an ontology or other data sources, and then all data connected through the ontology can be queried in a federated manner. The BIODB system enables multi-model data federation, i.e., it federates data that can be in different data models, including relational, XML, RDF, sequence data, and so on. It uses an ontologically enhanced system catalog, an ontological data index, an association index to facilitate cross-model data mapping, and a new algorithm for ontology-assisted keyword queries with ranking. The paper describes these components in detail, and presents an evaluation of the architecture in the context of an actual application.
BibTeX:
@article{gupta-biodb:-2010,
  author = {Gupta, Amarnath and Condit, Christopher and Qian, Xufei},
  title = {BioDB: An ontology-enhanced information system for heterogeneous biological information},
  journal = {Data \& Knowledge Engineering},
  year = {2010},
  volume = {69},
  number = {11},
  pages = {1084--1102}
}
Artmann, S. Biological Information 2008 , pp. 22-39  incollection  
BibTeX:
@incollection{artmann-biological-2008,
  author = {Artmann, Stefan},
  title = {Biological Information},
  year = {2008},
  pages = {22--39}
}
Galas, D.J., Nykter, M., Carter, G.W., Price, N.D. and Shmulevich, I. Biological Information as Set-Based Complexity 2010 IEEE Transactions on Information Theory
Vol. 56(2), pp. 667-677 
article  
Abstract: The significant and meaningful fraction of all the potential information residing in the molecules and structures of living systems is unknown. Sets of random molecular sequences or identically repeated sequences, for example, would be expected to contribute little or no useful information to a cell. This issue of quantitation of information is important since the ebb and flow of biologically significant information is essential to our quantitative understanding of biological function and evolution. Motivated specifically by these problems of biological information, a class of measures is proposed to quantify the contextual nature of the information in sets of objects, based on Kolmogorov's intrinsic complexity. Such measures discount both random and redundant information and are inherent in that they do not require a defined state space to quantify the information. The maximization of this new measure, which can be formulated in terms of the universal information distance, appears to have several useful and interesting properties, some of which we illustrate with examples.
BibTeX:
@article{galas-biological-2010,
  author = {Galas, D. J. and Nykter, M. and Carter, G. W. and Price, N. D. and Shmulevich, I.},
  title = {Biological Information as Set-Based Complexity},
  journal = {IEEE Transactions on Information Theory},
  year = {2010},
  volume = {56},
  number = {2},
  pages = {667--677}
}
Sarkar, S. Biological Information: A Skeptical Look at Some Central Dogmas of Molecular Biology 1996 Molecular Models of Life: Philosophical Papers on Molecular Biology, pp. 205-260  incollection  
BibTeX:
@incollection{sarkar-biological-1996,
  author = {Sarkar, Sahotra},
  title = {Biological Information: A Skeptical Look at Some Central Dogmas of Molecular Biology},
  booktitle = {Molecular Models of Life: Philosophical Papers on Molecular Biology},
  publisher = {MIT Press},
  year = {1996},
  pages = {205--260}
}
Akay, M. Biomedical signal processing 2012   book  
BibTeX:
@book{akay-biomedical-2012,
  author = {Akay, Metin},
  title = {Biomedical signal processing},
  publisher = {Elsevier Science},
  year = {2012}
}
Millikan, R.G. Biosemantics 1989 The Journal of Philosophy
Vol. 86(6), pp. 281-297 
article  
BibTeX:
@article{millikan-biosemantics-1989,
  author = {Millikan, Ruth G.},
  title = {Biosemantics},
  journal = {The Journal of Philosophy},
  year = {1989},
  volume = {86},
  number = {6},
  pages = {281--297}
}
Millikan, R.G. Biosemantics 2009
Vol. 1, pp. 394-407 
incollection  
Abstract: The term 'biosemantics' has usually been applied only to the theory of mental representation. This article first characterizes a more general class of theories called 'teleological theories of mental content', of which biosemantics is an example. Then it discusses the details that distinguish biosemantics from other naturalistic teleological theories. Naturalistic theories of mental representation attempt to explain, in terms designed to fit within the natural sciences, what it is about a mental representation that makes it represent something. Frequently these theories have been classified as picture theories, causal or covariation theories, information theories, functionalist or causal-role theories, or teleological theories, the assumption being that these various categories are side by side with one another.
BibTeX:
@incollection{millikan-biosemantics-2009,
  author = {Millikan, Ruth G.},
  title = {Biosemantics},
  publisher = {Oxford University Press},
  year = {2009},
  volume = {1},
  pages = {394--407}
}
Barbieri, M. Biosemiotics: a new understanding of life 2008 Naturwissenschaften
Vol. 95(7), pp. 577-599 
article  
Abstract: Biosemiotics is the idea that life is based on semiosis, i.e., on signs and codes. This idea has been strongly suggested by the discovery of the genetic code, but so far it has made little impact in the scientific world and is largely regarded as a philosophy rather than a science. The main reason for this is that modern biology assumes that signs and meanings do not exist at the molecular level, and that the genetic code was not followed by any other organic code for almost four billion years, which implies that it was an utterly isolated exception in the history of life. These ideas have effectively ruled out the existence of semiosis in the organic world, and yet there are experimental facts against all of them. If we look at the evidence of life without the preconditions of the present paradigm, we discover that semiosis is there, in every single cell, and that it has been there since the very beginning. This is what biosemiotics is really about. It is not a philosophy. It is a new scientific paradigm that is rigorously based on experimental facts. Biosemiotics claims that the genetic code (1) is a real code and (2) has been the first of a long series of organic codes that have shaped the history of life on our planet. The reality of the genetic code and the existence of other organic codes imply that life is based on two fundamental processes, copying and coding, and this in turn implies that evolution took place by two distinct mechanisms, i.e., by natural selection (based on copying) and by natural conventions (based on coding). It also implies that the copying of genes works on individual molecules, whereas the coding of proteins operates on collections of molecules, which means that different mechanisms of evolution exist at different levels of organization. This review intends to underline the scientific nature of biosemiotics, and to this purpose, it aims to prove (1) that the cell is a real semiotic system, (2) that the genetic code is a real code, (3) that evolution took place by natural selection and by natural conventions, and (4) that it was natural conventions, i.e., organic codes, that gave origin to the great novelties of macroevolution. Biological semiosis, in other words, is a scientific reality because the codes of life are experimental realities. The time has come, therefore, to acknowledge this fact of life, even if that means abandoning the present theoretical framework in favor of a more general one where biology and semiotics finally come together and become biosemiotics.
BibTeX:
@article{barbieri-biosemiotics:-2008,
  author = {Barbieri, Marcello},
  title = {Biosemiotics: a new understanding of life},
  journal = {Naturwissenschaften},
  year = {2008},
  volume = {95},
  number = {7},
  pages = {577--599}
}
Barbour, J. Bit from it 2015 It From Bit or Bit From It?, pp. 197-211  incollection  
BibTeX:
@incollection{barbour-bit-2015,
  author = {Barbour, Julian},
  title = {Bit from it},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {197--211}
}
Verlinde, E. and Verlinde, H. Black hole entanglement and quantum error correction 2013 Journal of High Energy Physics
Vol. 2013(10), pp. 1-34 
article  
Abstract: It was recently argued in [1] that black hole complementarity strains the basic rules of quantum information theory, such as monogamy of entanglement. Motivated by this argument, we develop a practical framework for describing black hole evaporation via unitary time evolution, based on a holographic perspective in which all black hole degrees of freedom live on the stretched horizon. We model the horizon as a unitary quantum system with finite entropy, and do not postulate that the horizon geometry is smooth. We then show that, with mild assumptions, one can reconstruct local effective field theory observables that probe the black hole interior, and relative to which the state near the horizon looks like a local Minkowski vacuum. The reconstruction makes use of the formalism of quantum error correcting codes, and works for black hole states whose entanglement entropy does not yet saturate the Bekenstein-Hawking bound. Our general framework clarifies the black hole final state proposal, and allows a quantitative study of the transition into the "firewall" regime of maximally mixed black hole states.
BibTeX:
@article{verlinde-black-2013,
  author = {Verlinde, Erik and Verlinde, Herman},
  title = {Black hole entanglement and quantum error correction},
  journal = {Journal of High Energy Physics},
  year = {2013},
  volume = {2013},
  number = {10},
  pages = {1--34}
}
Wald, R.M. Black hole entropy is the Noether charge 1993 Physical Review D
Vol. 48(8), pp. R3427-R3431 
article  
BibTeX:
@article{wald-black-1993,
  author = {Wald, Robert M.},
  title = {Black hole entropy is the Noether charge},
  journal = {Physical Review D},
  year = {1993},
  volume = {48},
  number = {8},
  pages = {R3427--R3431}
}
Smolin, L. Black hole information paradox and relative locality 2014 Physical Review D
Vol. 90(2) 
article  
BibTeX:
@article{smolin-black-2014,
  author = {Smolin, Lee},
  title = {Black hole information paradox and relative locality},
  journal = {Physical Review D},
  year = {2014},
  volume = {90},
  number = {2}
}
Joshi, P.S. and Narayan, R. Black Hole Paradoxes 2014   article  
Abstract: We propose here that the well-known black hole paradoxes such as the information loss and teleological nature of the event horizon are restricted to a particular idealized case, which is the homogeneous dust collapse model. In this case, the event horizon, which defines the boundary of the black hole, forms initially, and the singularity in the interior of the black hole forms at a later time. We show that, in contrast, gravitational collapse from physically more realistic initial conditions typically leads to the scenario in which the event horizon and space-time singularity form simultaneously. We point out that this apparently simple modification can mitigate the causality and teleological paradoxes, and also lends support to two recently suggested solutions to the information paradox, namely, the 'firewall' and 'classical chaos' proposals.
BibTeX:
@article{joshi-black-2014,
  author = {Joshi, Pankaj S. and Narayan, Ramesh},
  title = {Black Hole Paradoxes},
  year = {2014}
}
Hawking, S. Black holes and baby universes and other essays 1993   book  
BibTeX:
@book{hawking-black-1993,
  author = {Hawking, Stephen},
  title = {Black holes and baby universes and other essays},
  publisher = {Bantam Books},
  year = {1993}
}
Bekenstein, J.D. Black holes and information theory 2004 Contemporary Physics
Vol. 45(1), pp. 31-43 
article  
Abstract: During the past three decades investigators have unveiled a number of deep connections between physical information and black holes whose consequences for ordinary systems go beyond what has been deduced purely from the axioms of information theory. After a self-contained introduction to black hole thermodynamics, we review from its vantage point topics such as the information conundrum that emerges from the ability of incipient black holes to radiate, the various entropy bounds for non-black hole systems (holographic bound, universal entropy bound, etc.) which are most easily derived from black hole thermodynamics, Bousso's covariant entropy bound, the holographic principle of particle physics, and the subject of channel capacity of quantum communication channels.
BibTeX:
@article{bekenstein-black-2004,
  author = {Bekenstein, Jacob D.},
  title = {Black holes and information theory},
  journal = {Contemporary Physics},
  year = {2004},
  volume = {45},
  number = {1},
  pages = {31--43}
}
Barbón, J.L.F. Black holes, information and holography 2009 Journal of Physics: Conference Series
Vol. 171, pp. 012009 
article  
BibTeX:
@article{barbon-black-2009,
  author = {Barbón, J. L. F.},
  title = {Black holes, information and holography},
  journal = {Journal of Physics: Conference Series},
  year = {2009},
  volume = {171},
  pages = {012009}
}
Almheiri, A., Marolf, D., Polchinski, J. and Sully, J. Black holes: complementarity or firewalls? 2013 Journal of High Energy Physics
Vol. 2013(2), pp. 1-20 
article  
Abstract: We argue that the following three statements cannot all be true: (i) Hawking radiation is in a pure state, (ii) the information carried by the radiation is emitted from the region near the horizon, with low energy effective field theory valid beyond some microscopic distance from the horizon, and (iii) the infalling observer encounters nothing unusual at the horizon. Perhaps the most conservative resolution is that the infalling observer burns up at the horizon. Alternatives would seem to require novel dynamics that nevertheless cause notable violations of semiclassical physics at macroscopic distances from the horizon.
BibTeX:
@article{almheiri-black-2013,
  author = {Almheiri, Ahmed and Marolf, Donald and Polchinski, Joseph and Sully, James},
  title = {Black holes: complementarity or firewalls?},
  journal = {Journal of High Energy Physics},
  year = {2013},
  volume = {2013},
  number = {2},
  pages = {1--20}
}
Weller, C. Bonjour and Mentalese 1997 Synthese
Vol. 113(2), pp. 251 
article  
BibTeX:
@article{weller-bonjour-1997,
  author = {Weller, Cass},
  title = {Bonjour and Mentalese},
  journal = {Synthese},
  year = {1997},
  volume = {113},
  number = {2},
  pages = {251}
}
Pfaff, D.W. Brain Arousal and Information Theory: Neural and Genetic Mechanisms 2006   book  
BibTeX:
@book{pfaff-brain-2006,
  author = {Pfaff, Donald W.},
  title = {Brain Arousal and Information Theory: Neural and Genetic Mechanisms},
  publisher = {Harvard University Press},
  year = {2006}
}
Laureys, S., Antoine, S., Boly, M., Elincx, S., Faymonville, M.E., Berre, J., Sadzot, B., Ferring, M., De Tiege, X., van Bogaert, P., Hansen, I., Damas, P., Mavroudakis, N., Lambermont, B., Del Fiore, G., Aerts, J., Degueldre, C., Phillips, C., Franck, G., Vincent, J.L., Lamy, M., Luxen, A., Moonen, G., Goldman, S. and Maquet, P. Brain function in the vegetative state 2002 Acta Neurol Belg
Vol. 102 
article  
BibTeX:
@article{laureys-brain-2002,
  author = {Laureys, S. and Antoine, S. and Boly, M. and Elincx, S. and Faymonville, M. E. and Berre, J. and Sadzot, B. and Ferring, M. and De Tiege, X. and van Bogaert, P. and Hansen, I. and Damas, P. and Mavroudakis, N. and Lambermont, B. and Del Fiore, G. and Aerts, J. and Degueldre, C. and Phillips, C. and Franck, G. and Vincent, J. L. and Lamy, M. and Luxen, A. and Moonen, G. and Goldman, S. and Maquet, P.},
  title = {Brain function in the vegetative state},
  journal = {Acta Neurol Belg},
  year = {2002},
  volume = {102}
}
Friston, K.J. Brain function, nonlinear coupling, and neuronal transients 2001 Neuroscientist
Vol. 7 
article DOI URL 
BibTeX:
@article{friston-brain-2001,
  author = {Friston, K. J.},
  title = {Brain function, nonlinear coupling, and neuronal transients},
  journal = {Neuroscientist},
  year = {2001},
  volume = {7},
  url = {http://dx.doi.org/10.1177/107385840100700510},
  doi = {10.1177/107385840100700510}
}
Moruzzi, G. and Magoun, H.W. Brain stem reticular formation and activation of the EEG 1949 Electroencephalog Clin Neurophysiol
Vol. 1 
article DOI URL 
BibTeX:
@article{moruzzi-brain-1949,
  author = {Moruzzi, G. and Magoun, H. W.},
  title = {Brain stem reticular formation and activation of the EEG},
  journal = {Electroencephalog Clin Neurophysiol},
  year = {1949},
  volume = {1},
  url = {http://dx.doi.org/10.1016/0013-4694(49)90219-9},
  doi = {10.1016/0013-4694(49)90219-9}
}
Libet, B. Brain stimulation in the study of neuronal functions for conscious sensory experiences 1982 Human Neurobiology
Vol. 1 
article  
BibTeX:
@article{libet-brain-1982,
  author = {Libet, B.},
  title = {Brain stimulation in the study of neuronal functions for conscious sensory experiences},
  journal = {Human Neurobiology},
  year = {1982},
  volume = {1}
}
Steriade, M. and McCarley, R.W. Brainstem Control of Wakefulness and Sleep 1990   book URL 
BibTeX:
@book{steriade-brainstem-1990,
  author = {Steriade, M. and McCarley, R. W.},
  title = {Brainstem Control of Wakefulness and Sleep},
  publisher = {Plenum Press},
  year = {1990},
  url = {http://dx.doi.org/10.1007/978-1-4757-4669-3}
}
Dennett, D.C. Brainstorms: philosophical essays on mind and psychology 1978
Vol. 8 
book  
BibTeX:
@book{dennett-brainstorms:-1978,
  author = {Dennett, D. C.},
  title = {Brainstorms: philosophical essays on mind and psychology},
  publisher = {Harvester Press},
  year = {1978},
  volume = {8},
  edition = {1st}
}
Li, F. Bureaucracy and the state in early China: governing the western Zhou 2008   book  
BibTeX:
@book{li-bureaucracy-2008,
  author = {Li, Feng},
  title = {Bureaucracy and the state in early China: governing the western Zhou},
  publisher = {Cambridge University Press},
  year = {2008}
}
Bennett, K. By Our Bootstraps 2011 Philosophical Perspectives
Vol. 25(1), pp. 27-41 
article  
BibTeX:
@article{bennett-by-2011,
  author = {Bennett, Karen},
  title = {By Our Bootstraps},
  journal = {Philosophical Perspectives},
  year = {2011},
  volume = {25},
  number = {1},
  pages = {27--41}
}
Melnyk, A. Can Metaphysics Be Naturalized? And If So, How? 2013   incollection  
Abstract: An exercise in metaphilosophy, this chapter addresses one aspect of the relationship between science and philosophy. Non-naturalized, analytic metaphysics has not yielded results at all comparable with those achieved by mathematics and logic. Metaphysics needs to be naturalized, but how can scientific findings be made relevant to metaphysics—as evidence, as sources of new problems, or in other ways? Must some traditional metaphysical problems be abandoned as intractable? What sort of problems might take their place? Answers to these questions arise from detailed criticism of the answers given in Ladyman, Ross, et al., Every Thing Must Go: Metaphysics Naturalized. Naturalized metaphysics requires outstanding questions that we want answered but that don’t fall within the province of the sciences; there look to be such questions, including that of how to unify science; but whether we can answer them is best determined by trying to do so.
BibTeX:
@incollection{melnyk-can-2013,
  author = {Melnyk, Andrew},
  title = {Can Metaphysics Be Naturalized? And If So, How?},
  publisher = {Oxford University Press},
  year = {2013}
}
Bohr, N. Can Quantum-Mechanical Description of Physical Reality be Considered Complete? 1935 Physical Review
Vol. 48(8), pp. 696-702 
article  
BibTeX:
@article{bohr-can-1935,
  author = {Bohr, N.},
  title = {Can Quantum-Mechanical Description of Physical Reality be Considered Complete?},
  journal = {Physical Review},
  year = {1935},
  volume = {48},
  number = {8},
  pages = {696--702}
}
Einstein, A., Podolsky, B. and Rosen, N. Can Quantum-Mechanical Description of Physical Reality Be Considered Complete? 1935 Physical Review
Vol. 47(10), pp. 777-780 
article  
BibTeX:
@article{einstein-can-1935,
  author = {Einstein, A. and Podolsky, B. and Rosen, N.},
  title = {Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?},
  journal = {Physical Review},
  year = {1935},
  volume = {47},
  number = {10},
  pages = {777--780}
}
Brassard, G. and Méthot, A.A. Can Quantum-Mechanical Description of Physical Reality Be Considered Correct? 2010 Foundations of Physics
Vol. 40(4), pp. 463-468 
article  
Abstract: In an earlier paper written in loving memory of Asher Peres, we gave a critical analysis of the celebrated 1935 paper in which Einstein, Podolsky and Rosen (EPR) challenged the completeness of quantum mechanics. There, we had pointed out logical shortcomings in the EPR paper. Now, we raise additional questions concerning their suggested program to find a theory that would “provide a complete description of the physical reality”. In particular, we investigate the extent to which the EPR argumentation could have lead to the more dramatic conclusion that quantum mechanics is in fact incorrect. With this in mind, we propose a speculation, made necessary by a logical shortcoming in the EPR paper caused by the lack of a necessary condition for “elements of reality”, and surmise that an eventually complete theory would either be inconsistent with quantum mechanics, or would at least violate Heisenberg’s Uncertainty Principle.
BibTeX:
@article{brassard-can-2010,
  author = {Brassard, Gilles and Méthot, André A.},
  title = {Can Quantum-Mechanical Description of Physical Reality Be Considered Correct?},
  journal = {Foundations of Physics},
  year = {2010},
  volume = {40},
  number = {4},
  pages = {463--468}
}
Colyvan, M. Can the Eleatic Principle Be Justified? 1998 Canadian Journal of Philosophy
Vol. 28(3), pp. 313-335 
article  
BibTeX:
@article{colyvan-can-1998,
  author = {Colyvan, Mark},
  title = {Can the Eleatic Principle Be Justified?},
  journal = {Canadian Journal of Philosophy},
  year = {1998},
  volume = {28},
  number = {3},
  pages = {313--335}
}
Cao, T.Y. Can We Dissolve Physical Entities into Mathematical Structures? 2003 Synthese
Vol. 136(1), pp. 57-71 
article  
BibTeX:
@article{cao-can-2003,
  author = {Cao, Tian Y.},
  title = {Can We Dissolve Physical Entities into Mathematical Structures?},
  journal = {Synthese},
  year = {2003},
  volume = {136},
  number = {1},
  pages = {57--71}
}
Rolston, H. Care on Earth: generating informed concern 2010 Information and the Nature of Reality: From Physics to Metaphysics, pp. 205-246  incollection  
BibTeX:
@incollection{rolston-care-2010,
  author = {Rolston, Holmes},
  title = {Care on Earth: generating informed concern},
  booktitle = {Information and the Nature of Reality: From Physics to Metaphysics},
  publisher = {Cambridge University Press},
  year = {2010},
  pages = {205--246},
  doi = {10.1017/CBO9780511778759.011}
}
Schiemer, G. Carnap's Early Semantics 2013 Erkenntnis: An International Journal of Analytic Philosophy
Vol. 78(3), pp. 487 
article  
Abstract: This paper concerns Carnap's early contributions to formal semantics in his work on general axiomatics between 1928 and 1936. Its main focus is on whether he held a variable domain conception of models. I argue that interpreting Carnap's account in terms of a fixed domain approach fails to describe his premodern understanding of formal models. By drawing attention to the second part of Carnap's unpublished manuscript Untersuchungen zur allgemeinen Axiomatik, an alternative interpretation of the notions 'model', 'model extension' and 'submodel' in his theory of axiomatics is presented. Specifically, it is shown that Carnap's early model theory is based on a convention to simulate domain variation that is not identical but logically comparable to the modern account.
BibTeX:
@article{schiemer-carnaps-2013,
  author = {Schiemer, Georg},
  title = {Carnap's Early Semantics},
  journal = {Erkenntnis: An International Journal of Analytic Philosophy},
  year = {2013},
  volume = {78},
  number = {3},
  pages = {487}
}
Glynn, L. Causal foundationalism, physical causation, and difference-making 2013 Synthese
Vol. 190(6), pp. 1017-1037 
article  
Abstract: An influential tradition in the philosophy of causation has it that all token causal facts are, or are reducible to, facts about difference-making. Challenges to this tradition have typically focused on pre-emption cases, in which a cause apparently fails to make a difference to its effect. However, a novel challenge to the difference-making approach has recently been issued by Alyssa Ney. Ney defends causal foundationalism, which she characterizes as the thesis that facts about difference-making depend upon facts about physical causation. She takes this to imply that causation is not fundamentally a matter of difference-making. In this paper, I defend the difference-making approach against Ney's argument. I also offer some positive reasons for thinking, pace Ney, that causation is fundamentally a matter of difference-making.
BibTeX:
@article{glynn-causal-2013,
  author = {Glynn, Luke},
  title = {Causal foundationalism, physical causation, and difference-making},
  journal = {Synthese},
  year = {2013},
  volume = {190},
  number = {6},
  pages = {1017--1037}
}
Menzies, P. Causal Models, Token Causation, and Processes 2004 Philosophy of Science
Vol. 71(5), pp. 820-832 
article  
Abstract: Judea Pearl (2000) has recently advanced a theory of token causation using his structural equations approach. This paper examines some counterexamples to Pearl's theory, and argues that the theory can be modified in a natural way to overcome them.
BibTeX:
@article{menzies-causal-2004,
  author = {Menzies, Peter},
  title = {Causal Models, Token Causation, and Processes},
  journal = {Philosophy of Science},
  year = {2004},
  volume = {71},
  number = {5},
  pages = {820--832}
}
Skyrms, B. Causal necessity: a pragmatic investigation of the necessity of laws 1980   book  
BibTeX:
@book{skyrms-causal-1980,
  author = {Skyrms, Brian},
  title = {Causal necessity: a pragmatic investigation of the necessity of laws},
  publisher = {Yale University Press},
  year = {1980}
}
Chakravartty, A. Causal Realism: Events and Processes 2005 Erkenntnis (1975-)
Vol. 63(1), pp. 7-31 
article  
Abstract: Minimally, causal realism (as understood here) is the view that accounts of causation in terms of mere, regular or probabilistic conjunction are unsatisfactory, and that causal phenomena are correctly associated with some form of de re necessity. Classic arguments, however, some of which date back to Sextus Empiricus and have appeared many times since, including famously in Russell, suggest that the very notion of causal realism is incoherent. In this paper I argue that if such objections seem compelling, it is only because everyday expressions concerning causal phenomena are misleading with respect to certain metaphysical details. These expressions generally make reference to the relations of events or states of affairs, but ignore or obscure the role played by causal properties. I argue that on a proposed alternative, an analysis in terms of causal processes, more refined descriptions of causal phenomena escape the charge of incoherence. Causal necessity is here located in the relations of causal properties. I distinguish this view from the recent process theories of Salmon and Dowe, which are disinterested in causal realism.
BibTeX:
@article{chakravartty-causal-2005,
  author = {Chakravartty, Anjan},
  title = {Causal Realism: Events and Processes},
  journal = {Erkenntnis (1975-)},
  year = {2005},
  volume = {63},
  number = {1},
  pages = {7--31}
}
McEvoy, M. Causal tracking reliabilism and the Gettier problem 2014 Synthese
Vol. 191(17), pp. 4115-4130 
article  
Abstract: This paper argues that reliabilism can handle Gettier cases once it restricts knowledge producing reliable processes to those that involve a suitable causal link between the subject's belief and the fact it references. Causal tracking reliabilism (as this version of reliabilism is called) also avoids the problems that refuted the causal theory of knowledge, along with problems besetting more contemporary theories (such as virtue reliabilism and the "safety" account of knowledge). Finally, causal tracking reliabilism allows for a response to Linda Zagzebski's challenge that no theory of knowledge can both eliminate the possibility of Gettier cases while also allowing fully warranted but false beliefs.
BibTeX:
@article{mcevoy-causal-2014,
  author = {McEvoy, Mark},
  title = {Causal tracking reliabilism and the Gettier problem},
  journal = {Synthese},
  year = {2014},
  volume = {191},
  number = {17},
  pages = {4115--4130}
}
Chang, H. and Cartwright, N. Causality and Realism in the EPR Experiment 1993 Erkenntnis (1975-)
Vol. 38(2), pp. 169-190 
article  
Abstract: We argue against the common view that it is impossible to give a causal account of the distant correlations that are revealed in EPR-type experiments. We take a realistic attitude about quantum mechanics which implies a willingness to modify our familiar concepts according to its teachings. We object to the argument that the violation of factorizability in EPR rules out causal accounts, since such an argument is at best based on the desire to retain a classical description of nature that consists of processes that are continuous in space and time. We also do not think special relativity prohibits the superluminal propagation of causes in EPR, for the phenomenon of quantum measurement may very well fall outside the domain of application of special relativity. It is possible to give causal accounts of EPR as long as we are willing to take quantum mechanics seriously, and we offer two such accounts.
BibTeX:
@article{chang-causality-1993,
  author = {Chang, Hasok and Cartwright, Nancy},
  title = {Causality and Realism in the EPR Experiment},
  journal = {Erkenntnis (1975-)},
  year = {1993},
  volume = {38},
  number = {2},
  pages = {169--190}
}
Salmon, W.C. Causality without Counterfactuals 1994 Philosophy of Science
Vol. 61(2), pp. 297-312 
article  
Abstract: This paper presents a drastically revised version of the theory of causality, based on analyses of causal processes and causal interactions, advocated in Salmon (1984). Relying heavily on modified versions of proposals by P. Dowe, this article answers penetrating objections by Dowe and P. Kitcher to the earlier theory. It shows how the new theory circumvents a host of difficulties that have been raised in the literature. The result is, I hope, a more satisfactory analysis of physical causality.
BibTeX:
@article{salmon-causality-1994,
  author = {Salmon, Wesley C.},
  title = {Causality without Counterfactuals},
  journal = {Philosophy of Science},
  year = {1994},
  volume = {61},
  number = {2},
  pages = {297--312}
}
Collins, J.D., Hall, E.J. and Paul, L.A. Causation and counterfactuals 2004   book  
BibTeX:
@book{collins-causation-2004,
  author = {Collins, John D. and Hall, Edward J. and Paul, L. A.},
  title = {Causation and counterfactuals},
  publisher = {MIT Press},
  year = {2004}
}
Kutach, D. Causation and its basis in fundamental physics 2013   book  
Abstract: This book is the first comprehensive attempt to solve what Hartry Field has called "the central problem in the metaphysics of causation": the problem of reconciling the need for causal notions in the special sciences with the limited role of causation in physics. If the world evolves fundamentally according to laws of physics, what place can be found for the causal regularities and principles identified by the special sciences? Douglas Kutach answers this question by invoking a novel distinction between fundamental and derivative reality and a complementary conception of reduction. He then constructs a framework that allows all causal regularities from the sciences to be rendered in terms of fundamental relations. By drawing on a methodology that focuses on explaining the results of specially crafted experiments, Kutach avoids the endless task of catering to pre-theoretical judgments about causal scenarios. This volume is a detailed case study that uses fundamental physics to elucidate causation, but technicalities are eschewed so that a wide range of philosophers can profit. The book is packed with innovations: new models of events, probability, counterfactual dependence, influence, and determinism. These lead to surprising implications for topics like Newcomb's paradox, action at a distance, Simpson's paradox, and more. Kutach explores the special connection between causation and time, ultimately providing a never-before-presented explanation for the direction of causation. Along the way, readers will discover that events cause themselves, that low barometer readings do cause thunderstorms after all, and that we humans routinely affect the past more than we affect the future.
BibTeX:
@book{kutach-causation-2013,
  author = {Kutach, Douglas},
  title = {Causation and its basis in fundamental physics},
  publisher = {Oxford University Press},
  year = {2013}
}
Lam, V. Causation and Space-Time 2005 History and philosophy of the life sciences
Vol. 27(3/4), pp. 465-478 
article  
Abstract: This paper considers the physical accounts of causation in terms of conserved quantities in the light of the theory of general relativity. As it is rather well-known among physicists, there are several difficulties with the notions of conservation and localization of the (gravitational) energy-momentum within general relativity. We first begin to review the so-called conserved quantity theory of causation mainly due to Dowe and Salmon, then we discuss some consequences of these difficulties for this physical account of causation. We argue that these difficulties are due to the fundamental nature of the space-time structure as described by GR, which the conserved quantity theory of causation does not account for.
BibTeX:
@article{lam-causation-2005,
  author = {Lam, Vincent},
  title = {Causation and Space-Time},
  journal = {History and philosophy of the life sciences},
  year = {2005},
  volume = {27},
  number = {3/4},
  pages = {465--478}
}
Liebesman, D. Causation and the Canberra Plan 2011 Pacific Philosophical Quarterly
Vol. 92(2), pp. 232-242 
article  
Abstract: David Lewis has a general recipe for analysis: the Canberra Plan. His analyses of mind, color, and value all proceed according to the plan. What is curious is that his analysis of causation – one of his seminal analyses – doesn't. It doesn't and according to Lewis it can't. Lewis has two objections against using the Canberra Plan to analyze causation. After presenting Lewis' objections I argue that they both fail. I then draw some lessons from their failure.
BibTeX:
@article{liebesman-causation-2011,
  author = {Liebesman, David},
  title = {Causation and the Canberra Plan},
  journal = {Pacific Philosophical Quarterly},
  year = {2011},
  volume = {92},
  number = {2},
  pages = {232--242}
}
Fair, D. Causation and the Flow of Energy 1979 Erkenntnis (1975-)
Vol. 14(3), pp. 219-250 
article  
Abstract: Causation has traditionally been analyzed either as a relation of nomic dependence or as a relation of counterfactual dependence. I argue for a third program, a physicalistic reduction of the causal relation to one of energy-momentum transference in the technical sense of physics. This physicalistic analysis is argued to have the virtues of easily handling the standard counterexamples to the nomic and counterfactual analyses, offering a plausible epistemology for our knowledge of causes, and elucidating the nature of the relation between causation and physical science.
BibTeX:
@article{fair-causation-1979,
  author = {Fair, David},
  title = {Causation and the Flow of Energy},
  journal = {Erkenntnis (1975-)},
  year = {1979},
  volume = {14},
  number = {3},
  pages = {219--250}
}
Lewis, D. Causation as influence 2000 Journal of Philosophy
Vol. 97(4), pp. 182 
article  
Abstract: Lewis has long been an advocate of counterfactual analysis of causation. Unfortunately, the simplest counterfactual analysis breaks down in cases of redundant causation.
BibTeX:
@article{lewis-causation-2000,
  author = {Lewis, David},
  title = {Causation as influence},
  journal = {Journal of Philosophy},
  year = {2000},
  volume = {97},
  number = {4},
  pages = {182}
}
Woodward, J. Causation in biology: stability, specificity, and the choice of levels of explanation 2010 Biology & Philosophy
Vol. 25(3), pp. 287-318 
article  
Abstract: This paper attempts to elucidate three characteristics of causal relationships that are important in biological contexts. Stability has to do with whether a causal relationship continues to hold under changes in background conditions. Proportionality has to do with whether changes in the state of the cause "line up" in the right way with changes in the state of the effect and with whether the cause and effect are characterized in a way that contains irrelevant detail. Specificity is connected both to David Lewis' notion of "influence" and also with the extent to which a causal relation approximates to the ideal of one cause-one effect. Interrelations among these notions and their possible biological significance are also discussed.
BibTeX:
@article{woodward-causation-2010,
  author = {Woodward, James},
  title = {Causation in biology: stability, specificity, and the choice of levels of explanation},
  journal = {Biology \& Philosophy},
  year = {2010},
  volume = {25},
  number = {3},
  pages = {287--318}
}
Maudlin, T. Causation, Counterfactuals, and the Third Factor 2007, pp. 143-170  incollection  
Abstract: This chapter argues that the attempt to analyse causation in terms of counterfactuals is wrong-headed in a way that no amount of fine-tuning can fix. Causation is not to be analysed in terms of counterfactual dependency at all, no matter how many equants and epicycles are appended to the original rough draft. The systematic connections between judgements about causes and judgements about counterfactuals can be explained by the involvement of a third factor, some component of the truth conditions of counterfactual claims that is also a component of the truth conditions for causal claims. This third factor would provide the analogue of a ‘common cause’ explanation for the systematic connections between causal claims and counterfactuals: neither underpins the other but the third factor underpins them both.
BibTeX:
@incollection{maudlin-causation-2007,
  author = {Maudlin, Tim},
  title = {Causation, Counterfactuals, and the Third Factor},
  publisher = {Oxford University Press},
  year = {2007},
  pages = {143--170}
}
Price, H., Corry, R. and University of Sydney Centre for Time Causation, physics, and the constitution of reality: Russell's republic revisited 2007   inproceedings  
BibTeX:
@inproceedings{price-causation-2007,
  author = {Price, Huw and Corry, Richard and {University of Sydney Centre for Time}},
  title = {Causation, physics, and the constitution of reality: Russell's republic revisited},
  publisher = {Clarendon Press},
  year = {2007}
}
Paul, L.A. and Hall, E.J. Causation: a user's guide 2013   book  
Abstract: Causation is at once familiar and mysterious. Many believe that the causal relation is not directly observable, but that we nevertheless can somehow detect its presence in the world, and much work in the natural and social sciences relies on our ability to detect it. Yet neither common sense, scientific investigation, nor extensive philosophical debate has led us to anything like agreement on the correct analysis of the concept of causation or an account of the metaphysical nature of the causal relation. Contemporary philosophical debates about causation are driven by opposing motivations, conflicting intuitions, and unarticulated methodological assumptions. Causation: A User's Guide cuts a clear path through this confusing landscape. The book guides the reader through the most important philosophical treatments of causation, negotiating the terrain by taking a set of examples as landmarks. Special attention is given to counterfactual and related analyses of causation. Using a methodological principle based on the close examination of potential counterexamples, the book clarifies the central themes of the debate about causation, develops an account of the methodological rules one should follow when conducting a philosophical exploration of causation, and covers questions about causation involving omissions or absences, preemption and other species of redundant causation, and the possibility that causation is not transitive. Along the way, the authors examine several contemporary proposals for analyzing the nature of causation and assess their merits and overall methodological cogency, including proposals based on counterfactual analyses, regularities, causal modeling, contrasts, de facto accounts, and transference of conserved quantities. The book is designed to be of value both to trained specialists and those coming to the problem of causation for the first time. It provides the reader with a broad and sophisticated view of the metaphysics of the causal relation.
BibTeX:
@book{paul-causation:-2013,
  author = {Paul, L. A. and Hall, Edward J.},
  title = {Causation: a user's guide},
  publisher = {Oxford University Press},
  year = {2013}
}
Dowe, P. and Noordhof, P. Cause and chance: causation in an indeterministic world 2004   book  
BibTeX:
@book{dowe-cause-2004,
  author = {Dowe, Phil and Noordhof, Paul},
  title = {Cause and chance: causation in an indeterministic world},
  publisher = {Routledge},
  year = {2004}
}
Dowe, P. Causes are physically connected to their effects: Why preventers and omissions are not causes. 2004 Contemporary debates in philosophy of science, pp. 189-199  incollection  
BibTeX:
@incollection{dowe-causes-2004,
  author = {Dowe, Phil},
  title = {Causes are physically connected to their effects: Why preventers and omissions are not causes.},
  booktitle = {Contemporary debates in philosophy of science},
  publisher = {Blackwell},
  year = {2004},
  pages = {189--199}
}
Schaffer, J. Causes Need Not be Physically Connected to their Effects 2004 Contemporary debates in philosophy of science, pp. 197-215  incollection  
BibTeX:
@incollection{schaffer-causes-2004,
  author = {Schaffer, Jonathan},
  title = {Causes Need Not be Physically Connected to their Effects},
  booktitle = {Contemporary debates in philosophy of science},
  publisher = {Blackwell},
  year = {2004},
  pages = {197--215}
}
Ekstrom, A.D., Kahana, M.J., Caplan, J.B., Fields, T.A., Isham, E.A., Newman, E.L. and Fried, I. Cellular networks underlying human spatial navigation 2003 Nature
Vol. 425 
article DOI URL 
BibTeX:
@article{ekstrom-cellular-2003,
  author = {Ekstrom, A. D. and Kahana, M. J. and Caplan, J. B. and Fields, T. A. and Isham, E. A. and Newman, E. L. and Fried, I.},
  title = {Cellular networks underlying human spatial navigation},
  journal = {Nature},
  year = {2003},
  volume = {425},
  url = {http://dx.doi.org/10.1038/nature01964},
  doi = {10.1038/nature01964}
}
Bricmont, J. Chance in physics: foundations and perspectives 2001
Vol. 574 
book  
BibTeX:
@book{bricmont-chance-2001,
  author = {Bricmont, J.},
  title = {Chance in physics: foundations and perspectives},
  publisher = {Springer},
  year = {2001},
  volume = {574}
}
Seligman, J. Channels: From logic to probability 2009
Vol. 5363, pp. 193-233 
inproceedings  
BibTeX:
@inproceedings{seligman-channels:-2009,
  author = {Seligman, Jeremy},
  title = {Channels: From logic to probability},
  year = {2009},
  volume = {5363},
  pages = {193--233}
}
Wayne, A. Chapter 1 A Trope-Bundle Ontology for Field Theory 2008
Vol. 4, pp. 1-15 
incollection  
BibTeX:
@incollection{wayne-chapter-2008,
  author = {Wayne, Andrew},
  title = {Chapter 1 A Trope-Bundle Ontology for Field Theory},
  publisher = {Elsevier Science \& Technology},
  year = {2008},
  volume = {4},
  pages = {1--15}
}
Wimsatt, W. Characterizing the Robustness of Science: After the Practice Turn in Philosophy of Science 2012
Vol. 292 
book  
BibTeX:
@book{wimsatt-characterizing-2012,
  author = {Wimsatt, William},
  title = {Characterizing the Robustness of Science: After the Practice Turn in Philosophy of Science},
  publisher = {Springer Netherlands},
  year = {2012},
  volume = {292}
}
Burch, R. Charles Sanders Peirce 2014 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{burch-charles-2014,
  author = {Burch, Robert},
  title = {Charles Sanders Peirce},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2014},
  edition = {Winter 2014},
  url = {https://plato.stanford.edu/archives/win2014/entries/peirce/}
}
Graham, D.J. Chemical thermodynamics and information theory with applications 2011   book  
BibTeX:
@book{graham-chemical-2011,
  author = {Graham, Daniel J.},
  title = {Chemical thermodynamics and information theory with applications},
  publisher = {CRC Press},
  year = {2011}
}
Floridi, L. Children of the Fourth Revolution 2011 Philosophy & Technology
Vol. 24(3), pp. 227-232 
article  
Abstract: It is a well-known fact that artificial intelligence (AI) research seeks both to reproduce the outcome of our intelligent behaviour by non-biological means, and to produce the non-biological equivalent of our intelligence. The two souls of AI, the engineering and the cognitive one, have often engaged in fratricidal feuds for intellectual predominance, academic power, and financial resources. Perhaps because ACs are neither Asimov's robots nor Hal's children, the philosophical questions they posit are very concrete. When is an informational artefact a companion?
BibTeX:
@article{floridi-children-2011,
  author = {Floridi, Luciano},
  title = {Children of the Fourth Revolution},
  journal = {Philosophy \& Technology},
  year = {2011},
  volume = {24},
  number = {3},
  pages = {227--232}
}
Thoreau, H.D. and NetLibrary, Inc. Civil disobedience 1990   book  
BibTeX:
@book{thoreau-civil-1990,
  author = {Thoreau, Henry D. and {NetLibrary, Inc.}},
  title = {Civil disobedience},
  publisher = {Alex Catalogue},
  year = {1990}
}
Adami, C. and Ver Steeg, G. Classical information transmission capacity of quantum black holes 2014 CLASSICAL AND QUANTUM GRAVITY
Vol. 31(7) 
article  
Abstract: The fate of classical information incident on a quantum black hole has been the subject of an ongoing controversy in theoretical physics, because a calculation within the framework of semi-classical curved-space quantum field theory appears to show that the incident information is irretrievably lost, in contradiction to time-honored principles such as time-reversibility and unitarity. Here, we show within this framework embedded in quantum communication theory that signaling from past to future null infinity in the presence of a Schwarzschild black hole can occur with arbitrary accuracy, and thus that classical information is not lost in black hole dynamics. The calculation relies on a treatment that is manifestly unitary from the outset, where probability conservation is guaranteed because black holes stimulate the emission of radiation in response to infalling matter. This stimulated radiation is non-thermal and contains all of the information about the infalling matter, while Hawking radiation contains none of it.
BibTeX:
@article{adami-classical-2014,
  author = {Adami, C. and Ver Steeg, G.},
  title = {Classical information transmission capacity of quantum black holes},
  journal = {CLASSICAL AND QUANTUM GRAVITY},
  year = {2014},
  volume = {31},
  number = {7}
}
Peres, A. Classical interventions in quantum systems. I. The measuring process 2000 Physical Review A - Atomic, Molecular, and Optical Physics
Vol. 61(2), pp. 022116 
article  
Abstract: The measuring process is an external intervention in the dynamics of a quantum system. It involves a unitary interaction of that system with a measuring apparatus, a further interaction of both with an unknown environment causing decoherence, and then the deletion of a subsystem. This description of the measuring process is a substantial generalization of current models in quantum measurement theory. In particular, no ancilla is needed. The final result is represented by a completely positive map of the quantum state ρ (possibly with a change of the dimensions of ρ). A continuous limit of the above process leads to Lindblad's equation for the quantum dynamical semigroup [Commun. Math. Phys. 48, 119 (1976)].
BibTeX:
@article{peres-classical-2000,
  author = {Peres, Asher},
  title = {Classical interventions in quantum systems. I. The measuring process},
  journal = {Physical Review A - Atomic, Molecular, and Optical Physics},
  year = {2000},
  volume = {61},
  number = {2},
  pages = {022116}
}
Henderson, L. and Vedral, V. Classical, quantum and total correlations 2001 Journal of Physics A: Mathematical and General
Vol. 34(35), pp. 6899 
article URL 
Abstract: We discuss the problem of separating consistently the total correlations in a bipartite quantum state into a quantum and a purely classical part. A measure of classical correlations is proposed and its properties are explored.
BibTeX:
@article{henderson-classical-2001,
  author = {Henderson, L. and Vedral, V.},
  title = {Classical, quantum and total correlations},
  journal = {Journal of Physics A: Mathematical and General},
  year = {2001},
  volume = {34},
  number = {35},
  pages = {6899},
  url = {http://stacks.iop.org/0305-4470/34/i=35/a=315}
}
Shannon, C.E., Sloane, N.J.A., Wyner, A.D. and IEEE Information Theory Society Claude Elwood Shannon: collected papers 1993   book  
BibTeX:
@book{shannon-claude-1993,
  author = {Shannon, Claude E. and Sloane, N. J. A. and Wyner, A. D. and {IEEE Information Theory Society}},
  title = {Claude Elwood Shannon: collected papers},
  publisher = {IEEE Press},
  year = {1993}
}
Biro, J.C. Coding nucleic acids are chaperons for protein folding: A novel theory of protein folding 2013 Gene
Vol. 515(2), pp. 249-257 
article  
Abstract: The arguments for nucleic acid chaperons are reviewed and three new lines of evidence are added. (1) It was found that amino acids encoded by codons in short nucleic acid loops frequently form turns and helices in the corresponding protein structures. (2) The amino acids encoded by partially complementary (1st and 3rd nucleotides) codons are more frequently co-located in the encoded proteins than expected by chance. (3) There are significant correlations between thermodynamic changes (ddG) caused by codon mutations in nucleic acids and the thermodynamic changes caused by the corresponding amino acid mutations in the encoded proteins. We conclude that the concept of the Proteomic Code and nucleic acid chaperons seems correct from the bioinformatics point of view, and we expect to see direct biochemical experiments and evidence in the near future.
BibTeX:
@article{biro-coding-2013,
  author = {Biro, Jan C.},
  title = {Coding nucleic acids are chaperons for protein folding: A novel theory of protein folding},
  journal = {Gene},
  year = {2013},
  volume = {515},
  number = {2},
  pages = {249--257}
}
Boucheron, S., Garivier, A. and Gassiat, E. Coding on Countably Infinite Alphabets 2009 IEEE Transactions on Information Theory
Vol. 55(1), pp. 358-373 
article  
Abstract: This paper describes universal lossless coding strategies for compressing sources on countably infinite alphabets. Classes of memoryless sources defined by an envelope condition on the marginal distribution provide benchmarks for coding techniques originating from the theory of universal coding over finite alphabets. We prove general upper bounds on minimax regret and lower bounds on minimax redundancy for such source classes. The general upper bounds emphasize the role of the normalized maximum likelihood (NML) codes with respect to minimax regret in the infinite alphabet context. Lower bounds are derived by tailoring sharp bounds on the redundancy of Krichevsky-Trofimov coders for sources over finite alphabets. Up to logarithmic (resp., constant) factors the bounds are matching for source classes defined by algebraically declining (resp., exponentially vanishing) envelopes. Effective and (almost) adaptive coding techniques are described for the collection of source classes defined by algebraically vanishing envelopes. Those results extend our knowledge concerning universal coding to contexts where the key tools from parametric inference are known to fail.
BibTeX:
@article{boucheron-coding-2009,
  author = {Boucheron, S. and Garivier, A. and Gassiat, E.},
  title = {Coding on Countably Infinite Alphabets},
  journal = {IEEE Transactions on Information Theory},
  year = {2009},
  volume = {55},
  number = {1},
  pages = {358--373}
}
Aydede, M. and Güzeldere, G. Cognitive Architecture, Concepts, and Introspection: An Information‐Theoretic Solution to the Problem of Phenomenal Consciousness 2005 Noûs
Vol. 39(2), pp. 197-255 
article  
BibTeX:
@article{aydede-cognitive-2005,
  author = {Aydede, Murat and Güzeldere, Güven},
  title = {Cognitive Architecture, Concepts, and Introspection: An Information‐Theoretic Solution to the Problem of Phenomenal Consciousness},
  journal = {Noûs},
  year = {2005},
  volume = {39},
  number = {2},
  pages = {197--255}
}
Peres, A. Collective tests for quantum nonlocality 1996 Physical Review A - Atomic, Molecular, and Optical Physics
Vol. 54(4), pp. 2685-2689 
article  
Abstract: Pairs of spin-1/2 particles are prepared in a Werner state (namely, a mixture of singlet and random components). If the random component is large enough, the statistical results of spin measurements that may be performed on each pair separately can be reproduced by an algorithm involving local “hidden” variables. However, if several such pairs are tested simultaneously, a violation of the Clauser-Horne-Shimony-Holt inequality may occur, and no local hidden variable model is compatible with the results.
BibTeX:
@article{peres-collective-1996,
  author = {Peres, Asher},
  title = {Collective tests for quantum nonlocality},
  journal = {Physical Review A - Atomic, Molecular, and Optical Physics},
  year = {1996},
  volume = {54},
  number = {4},
  pages = {2685--2689}
}
Maruyama, K., Nori, F. and Vedral, V. Colloquium: the physics of Maxwell's demon and information 2009 Reviews of Modern Physics
Vol. 81(1), pp. 1 
article  
Abstract: The history of Maxwell's demon and a number of interesting consequences of the second law of thermodynamics in quantum mechanics and the theory of gravity are reviewed. Maxwell introduced his demon in 1871 as a thought experiment to demonstrate that the second law is a statistical principle that almost always holds, rather than an absolute law. The demon is used as the starting point of a discussion of the erasure principle, before the implications of the second law for the wave nature of light, Gibbs' paradox and the quantum superposition principle, quantum state discrimination, linearity in quantum dynamics, general relativity, and the Einstein equation from thermodynamics are discussed. The erasure of classical information encoded in quantum states, the thermodynamic derivation of the Holevo bound, entanglement detection using the demon, and physical implementations of the demon are then explored.
BibTeX:
@article{maruyama-colloquium:-2009,
  author = {Maruyama, Koji and Nori, Franco and Vedral, Vlatko},
  title = {Colloquium: the physics of Maxwell's demon and information},
  journal = {Reviews of Modern Physics},
  year = {2009},
  volume = {81},
  number = {1},
  pages = {1}
}
Gillen, P. and Ghosh, D. Colonialism & modernity 2007   book  
BibTeX:
@book{gillen-colonialism-2007,
  author = {Gillen, Paul and Ghosh, Devleena},
  title = {Colonialism \& modernity},
  publisher = {University of New South Wales Press},
  year = {2007}
}
Plum, F. Coma and related global disturbances of the human conscious state 1991   book URL 
BibTeX:
@book{plum-coma-1991,
  author = {Plum, F.},
  title = {Coma and related global disturbances of the human conscious state},
  publisher = {Plenum Press},
  year = {1991},
  url = {http://dx.doi.org/10.1007/978-1-4615-6622-9-9}
}
Smith, J.M. Commentary on Kerr and Godfrey-Smith 2002 Biology and Philosophy
Vol. 17(4), pp. 523-527 
article  
BibTeX:
@article{smith-commentary-2002,
  author = {Smith, John M.},
  title = {Commentary on Kerr and Godfrey-Smith},
  journal = {Biology and Philosophy},
  year = {2002},
  volume = {17},
  number = {4},
  pages = {523--527}
}
Millikan, R.G. Compare and Contrast Dretske, Fodor, and Millikan on Teleosemantics 1990 Philosophical Topics
Vol. 18(2), pp. 151 
article  
BibTeX:
@article{millikan-compare-1990,
  author = {Millikan, Ruth G.},
  title = {Compare and Contrast Dretske, Fodor, and Millikan on Teleosemantics},
  journal = {Philosophical Topics},
  year = {1990},
  volume = {18},
  number = {2},
  pages = {151}
}
Rooij, R.V. Comparing questions and answers: A bit of logic, a bit of language, and some bits of information 2009
Vol. 5363, pp. 161-192 
inproceedings  
BibTeX:
@inproceedings{rooij-comparing-2009,
  author = {Rooij, Robert V.},
  title = {Comparing questions and answers: A bit of logic, a bit of language, and some bits of information},
  year = {2009},
  volume = {5363},
  pages = {161--192}
}
Busch, P., Heinosaari, T., Schultz, J. and Stevens, N. Comparing the degrees of incompatibility inherent in probabilistic physical theories 2013 EPL (Europhysics Letters)
Vol. 103(1), pp. 10002 
article URL 
Abstract: We introduce a new way of quantifying the degrees of incompatibility of two observables in a probabilistic physical theory and, based on this, a global measure of the degree of incompatibility inherent in such theories, across all observable pairs. This opens up a novel and flexible way of comparing probabilistic theories with respect to the nonclassical feature of incompatibility, raising many interesting questions, some of which will be answered here. We show that quantum theory contains observables that are as incompatible as any probabilistic physical theory can have if arbitrary pairs of observables are considered. If one adopts a more refined measure of the degree of incompatibility, for instance, by restricting the comparison to binary observables, it turns out that there are probabilistic theories whose inherent degrees of incompatibility are greater than that of quantum mechanics.
BibTeX:
@article{busch-comparing-2013,
  author = {Busch, Paul and Heinosaari, Teiko and Schultz, Jussi and Stevens, Neil},
  title = {Comparing the degrees of incompatibility inherent in probabilistic physical theories},
  journal = {EPL (Europhysics Letters)},
  year = {2013},
  volume = {103},
  number = {1},
  pages = {10002},
  url = {http://stacks.iop.org/0295-5075/103/i=1/a=10002}
}
Greenberger, D.M., Hentschel, K. and Weinert, F. Compendium of quantum physics: concepts, experiments, history, and philosophy 2009   book  
BibTeX:
@book{greenberger-compendium-2009,
  author = {Greenberger, Daniel M. and Hentschel, Klaus and Weinert, Friedel},
  title = {Compendium of quantum physics: concepts, experiments, history, and philosophy},
  publisher = {Springer},
  year = {2009}
}
Wimsatt, W.C. Complexity and Organization 1972 PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association
Vol. 1972, pp. 67-86 
article  
BibTeX:
@article{wimsatt-complexity-1972,
  author = {Wimsatt, William C.},
  title = {Complexity and Organization},
  journal = {PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association},
  year = {1972},
  volume = {1972},
  pages = {67--86}
}
Zurek, W.H. Complexity, entropy, and the physics of information: the proceedings of the 1988 Workshop on Complexity, Entropy, and the Physics of Information held May-June, 1989, in Santa Fe, New Mexico 1990
Vol. 8 
inproceedings  
BibTeX:
@inproceedings{zurek-complexity-1990,
  author = {Zurek, Wojciech H.},
  title = {Complexity, entropy, and the physics of information: the proceedings of the 1988 Workshop on Complexity, Entropy, and the Physics of Information held May-June, 1989, in Santa Fe, New Mexico},
  publisher = {Addison-Wesley Pub. Co},
  year = {1990},
  volume = {8}
}
Bhaumik, M. Comprehending Quantum Theory from Quantum Fields 2013   article  
Abstract: At the primary level of reality as described by quantum field theory, a fundamental particle like an electron represents a stable, discrete, propagating excited state of its underlying quantum field. QFT also tells us that the lowest vacuum state as well as the excited states of such a field is always very active with spontaneous, unpredictable quantum fluctuations. Also an underlying quantum field is known to be indestructible and immutable possessing the same value in each element of spacetime comprising the universe. These characteristics of the primary quantum fields together with the fact that the quantum fluctuations can be cogently substantiated to be quantum coherent throughout the universe provide a possible ontology of the quantum theory. In this picture, the wave function of a quantum particle represents the reality of the inherent quantum fluctuations at the core of the universe and endows the particle its counter intuitive quantum behavior.
BibTeX:
@article{bhaumik-comprehending-2013,
  author = {Bhaumik, Mani},
  title = {Comprehending Quantum Theory from Quantum Fields},
  year = {2013}
}
Ołdziej, S., Czaplewski, C., Liwo, A., Vila, J.A. and Scheraga, H.A. Computation of Structure, Dynamics, and Thermodynamics of Proteins 2012 Comprehensive Biophysics, pp. 494-513  incollection URL 
Abstract: In addition to wet laboratory experiments and theory, molecular simulations have emerged as the third methodology for studying molecular systems. Nowadays, they are indispensable for interpreting experimental results and understanding the molecular origin of biological processes. In this chapter, the techniques for simulating protein structure, dynamics, and thermodynamics are presented. The force fields used in simulations, both in all-atom and in coarse-grained representations, are discussed in terms of the complication and applicability of the models, including treatment of the solvent and the dependence of the accuracy of the treatment and time/size scale covered by each type of model. An overview of simulation techniques – global optimization of conformational energy, Monte Carlo, and molecular dynamics methods, as well as generalized ensemble methods – is presented in the context of their applications. Finally, use of experimental information, mainly from nuclear magnetic resonance data, in the form of restraints imposed during simulations or filters used to postprocess the results, is discussed.
BibTeX:
@incollection{oldziej-computation-2012,
  author = {Ołdziej, S. and Czaplewski, C. and Liwo, A. and Vila, J. A. and Scheraga, H. A.},
  title = {Computation of Structure, Dynamics, and Thermodynamics of Proteins},
  booktitle = {Comprehensive Biophysics},
  publisher = {Elsevier},
  year = {2012},
  pages = {494--513},
  url = {http://www.sciencedirect.com/science/article/pii/B9780123749208001260}
}
Shalizi, C.R. and Crutchfield, J.P. Computational mechanics: Pattern and prediction, structure and simplicity 2001 Journal of Statistical Physics
Vol. 104 
article DOI URL 
BibTeX:
@article{shalizi-computational-2001,
  author = {Shalizi, C. R. and Crutchfield, J. P.},
  title = {Computational mechanics: Pattern and prediction, structure and simplicity},
  journal = {Journal of Statistical Physics},
  year = {2001},
  volume = {104},
  url = {http://dx.doi.org/10.1023/A:1010388907793},
  doi = {10.1023/A:1010388907793}
}
Chaitin, G.J. Computers, paradoxes and the foundations of mathematics 2002 American Scientist
Vol. 90(2), pp. 164 
article  
Abstract: The consideration of the inherent paradoxes and randomness in mathematics by some of the great thinkers of the 20th century has benefited computer science. At the start of the 20th century, the well-known German mathematician David Hilbert proposed formalizing completely all of mathematical reasoning. Hilbert's proposal was driven by the logical paradoxes that his contemporary Bertrand Russell had highlighted; that is, cases where reasoning that seems to be sound leads to contradictions. However, Kurt Gödel and Alan Turing showed that mathematical reasoning cannot be formalized. Their work was taken a step further by the author, whose research demonstrates that randomness is natural and inevitable in mathematics. Nevertheless, formalism has been one of the biggest boons of the 20th century for programming, for calculating, and for computing.
BibTeX:
@article{chaitin-computers-2002,
  author = {Chaitin, Gregory J.},
  title = {Computers, paradoxes and the foundations of mathematics},
  journal = {American Scientist},
  year = {2002},
  volume = {90},
  number = {2},
  pages = {164}
}
Dodig Crnkovic, G. and Giovagnoli, R. Computing nature: Turing centenary perspective 2013
Vol. 7 
book  
BibTeX:
@book{dodig-crnkovic-computing-2013,
  author = {Dodig Crnkovic, Gordana and Giovagnoli, Raffaela},
  title = {Computing nature: Turing centenary perspective},
  publisher = {Springer},
  year = {2013},
  volume = {7}
}
Braddon-Mitchell, D. and Nola, R. Conceptual analysis and philosophical naturalism 2009   book  
BibTeX:
@book{braddon-mitchell-conceptual-2009,
  author = {Braddon-Mitchell, David and Nola, Robert},
  title = {Conceptual analysis and philosophical naturalism},
  publisher = {MIT Press},
  year = {2009}
}
Chalmers, D.J. and Jackson, F. Conceptual Analysis and Reductive Explanation 2001 The Philosophical Review
Vol. 110(3), pp. 315-360 
article  
Abstract: Chalmers and Jackson argue that ordinary truths about macroscopic natural phenomena are entailed a priori by the combination of physical truths, phenomenal truths, indexical truths, and a that's-all statement. They also argue that reductive explanation requires a priori entailment, and that physicalism requires a priori entailment.
BibTeX:
@article{chalmers-conceptual-2001,
  author = {Chalmers, David J. and Jackson, Frank},
  title = {Conceptual Analysis and Reductive Explanation},
  journal = {The Philosophical Review},
  year = {2001},
  volume = {110},
  number = {3},
  pages = {315--360}
}
Martin, C.A. Conceptual Development of 20th Century Field Theories 2000
Vol. 51(3) 
book  
Abstract: "Conceptual Developments of 20th Century Field Theories" by Tian Yu Cao is reviewed.
BibTeX:
@book{martin-conceptual-2000,
  author = {Martin, Christopher A.},
  title = {Conceptual Development of 20th Century Field Theories},
  year = {2000},
  volume = {51},
  number = {3}
}
Cao, T.Y. Conceptual developments of 20th century field theories 1997   book  
BibTeX:
@book{cao-conceptual-1997,
  author = {Cao, Tian Y.},
  title = {Conceptual developments of 20th century field theories},
  publisher = {Cambridge University Press},
  year = {1997}
}
Christensen, J. Conceptual frameworks of accounting from an information perspective 2010 Accounting and Business Research
Vol. 40(3), pp. 287-299 
article  
Abstract: This paper analyses the benefits of accounting regulation and a conceptual framework using an information economics approach that allows consideration of uncertainty, multiple agents, demand for information, and multiple information sources. It also allows private information to enter the analysis. The analysis leads to a set of fundamental properties of accounting information. It is argued that the set of qualitative characteristics typically contained in conceptual frameworks does not adequately aggregate the information demands of users of accounting information. For example, the IASB's conceptual framework contains no guidelines for the trade-off between relevance and reliability. Furthermore, neutrality might not be part of an optimal regulation. The statistical bias introduced by the stewardship use of accounting information is not necessarily undesirable and will always remain; stewardship is the characteristic of accounting information that provides incentives for management to act in the desired way. Accounting information is inherently late compared to other information sources but influences and constrains the content of more timely sources. The accounting system does not exist in a vacuum. Other information sources are present and the purpose of the accounting system cannot be analysed without considering the existence of other information sources. Finally, financial statements are audited by an independent auditor. This implies that accounting data are hard to manipulate.
BibTeX:
@article{christensen-conceptual-2010,
  author = {Christensen, John},
  title = {Conceptual frameworks of accounting from an information perspective},
  journal = {Accounting and Business Research},
  year = {2010},
  volume = {40},
  number = {3},
  pages = {287--299}
}
Bain, J. Condensed Matter Physics and the Nature of Spacetime 2008
Vol. 4, The Ontology of Spacetime II, pp. 301-329 
incollection URL 
BibTeX:
@incollection{bain-condensed-2008,
  author = {Bain, Jonathan},
  title = {Condensed Matter Physics and the Nature of Spacetime},
  booktitle = {The Ontology of Spacetime II},
  publisher = {Elsevier},
  year = {2008},
  volume = {4},
  pages = {301--329},
  url = {http://www.sciencedirect.com/science/article/pii/S1871177408000168}
}
Crupi, V. Confirmation 2016 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{crupi-confirmation-2016,
  author = {Crupi, Vincenzo},
  title = {Confirmation},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2016},
  edition = {Winter 2016},
  url = {https://plato.stanford.edu/archives/win2016/entries/confirmation/}
}
Popper, K.R. Conjectures and refutations: the growth of scientific knowledge 1972   book  
BibTeX:
@book{popper-conjectures-1972,
  author = {Popper, Karl R.},
  title = {Conjectures and refutations: the growth of scientific knowledge},
  publisher = {Routledge & K. Paul},
  year = {1972},
  edition = {4th (rev.).}
}
Zeman, A. Consciousness 2001 Brain
Vol. 124 
article DOI URL 
BibTeX:
@article{zeman-consciousness-2001,
  author = {Zeman, A.},
  title = {Consciousness},
  journal = {Brain},
  year = {2001},
  volume = {124},
  url = {http://dx.doi.org/10.1093/brain/124.7.1263},
  doi = {10.1093/brain/124.7.1263}
}
Tononi, G. and Edelman, G.M. Consciousness and complexity 1998 Science
Vol. 282 
article DOI URL 
BibTeX:
@article{tononi-consciousness-1998,
  author = {Tononi, G. and Edelman, G. M.},
  title = {Consciousness and complexity},
  journal = {Science},
  year = {1998},
  volume = {282},
  url = {http://dx.doi.org/10.1126/science.282.5395.1846},
  doi = {10.1126/science.282.5395.1846}
}
Crick, F. and Koch, C. Consciousness and neuroscience 1998 Cerebral Cortex
Vol. 8 
article DOI URL 
BibTeX:
@article{crick-consciousness-1998,
  author = {Crick, F. and Koch, C.},
  title = {Consciousness and neuroscience},
  journal = {Cerebral Cortex},
  year = {1998},
  volume = {8},
  url = {http://dx.doi.org/10.1093/cercor/8.2.97},
  doi = {10.1093/cercor/8.2.97}
}
Singer, W. Consciousness and the binding problem 2001 Annals of the New York Academy of Sciences
Vol. 929 
article DOI URL 
BibTeX:
@article{singer-consciousness-2001,
  author = {Singer, W.},
  title = {Consciousness and the binding problem},
  journal = {Annals of the New York Academy of Sciences},
  year = {2001},
  volume = {929},
  url = {http://dx.doi.org/10.1111/j.1749-6632.2001.tb05712.x},
  doi = {10.1111/j.1749-6632.2001.tb05712.x}
}
Tononi, G. Consciousness and the brain: Theoretical aspects 2004   book  
BibTeX:
@book{tononi-consciousness-2004,
  author = {Tononi, G.},
  title = {Consciousness and the brain: Theoretical aspects},
  year = {2004}
}
Tegmark, M. Consciousness as a state of matter 2015 Chaos, Solitons & Fractals
Vol. 76, pp. 238-270 
article  
BibTeX:
@article{tegmark-consciousness-2015,
  author = {Tegmark, Max},
  title = {Consciousness as a state of matter},
  journal = {Chaos, Solitons \& Fractals},
  year = {2015},
  volume = {76},
  pages = {238--270}
}
Dennett, D.C. Consciousness explained 1993   book  
BibTeX:
@book{dennett-consciousness-1993,
  author = {Dennett, D. C.},
  title = {Consciousness explained},
  publisher = {Penguin},
  year = {1993}
}
Sperry, R. Consciousness, personal identity and the divided brain 1984 Neuropsychologia
Vol. 22 
article DOI URL 
BibTeX:
@article{sperry-consciousness-1984,
  author = {Sperry, R.},
  title = {Consciousness, personal identity and the divided brain},
  journal = {Neuropsychologia},
  year = {1984},
  volume = {22},
  url = {http://dx.doi.org/10.1016/0028-3932(84)90093-9},
  doi = {10.1016/0028-3932(84)90093-9}
}
Rorty, R. Consequences of pragmatism: essays, 1972-1980 1982   book  
BibTeX:
@book{rorty-consequences-1982,
  author = {Rorty, Richard},
  title = {Consequences of pragmatism: essays, 1972-1980},
  publisher = {University of Minnesota Press},
  year = {1982}
}
Cator, E. and Landsman, K. Constraints on Determinism: Bell Versus Conway-Kochen 2014 Foundations of Physics
Vol. 44(7), pp. 781-791 
article  
Abstract: Bell's Theorem from 1964 and the (Strong) Free Will Theorem of Conway and Kochen from 2009 both exclude deterministic hidden variable theories (or, in modern parlance, 'ontological models') that are compatible with some small fragment of quantum mechanics, admit 'free' settings of the archetypal Alice & Bob experiment, and satisfy a locality condition called Parameter Independence. We clarify the relationship between these theorems by giving reformulations of both that exactly pinpoint their resemblance and their differences. Our reformulation imposes determinism in what we see as the only consistent way, in which the 'ontological state' initially determines both the settings and the outcome of the experiment. The usual status of the settings as 'free' parameters is subsequently recovered from independence assumptions on the pertinent (random) variables. Our reformulation also clarifies the role of the settings in Bell's later generalization of his theorem to stochastic hidden variable theories.
BibTeX:
@article{cator-constraints-2014,
  author = {Cator, E. and Landsman, K.},
  title = {Constraints on Determinism: Bell Versus Conway-Kochen},
  journal = {Foundations of Physics},
  year = {2014},
  volume = {44},
  number = {7},
  pages = {781--791}
}
Monton, B. and Mohler, C. Constructive Empiricism 2012 The Stanford Encyclopedia of Philosophy  misc URL 
BibTeX:
@misc{monton-constructive-2012,
  author = {Monton, Bradley and Mohler, Chad},
  title = {Constructive Empiricism},
  journal = {The Stanford Encyclopedia of Philosophy},
  publisher = {Center for the Study of Language and Information (CSLI), Stanford University},
  year = {2012},
  url = {http://plato.stanford.edu/entries/constructive-empiricism/}
}
Monton, B. and Mohler, C. Constructive Empiricism 2014 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{monton-constructive-2014,
  author = {Monton, Bradley and Mohler, Chad},
  title = {Constructive Empiricism},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  year = {2014},
  edition = {Spring 2014},
  url = {http://plato.stanford.edu/archives/spr2014/entries/constructive-empiricism/}
}
Ladyman, J. Constructive Empiricism and Modal Metaphysics: A Reply to Monton and van Fraassen 2004 The British Journal for the Philosophy of Science
Vol. 55(4), pp. 755-765 
article  
Abstract: In this journal [2000], I argued that Bas van Fraassen's constructive empiricism was undermined in various ways by his antirealism about modality. Here I offer some comments and responses to the reply to my arguments by Bradley Monton and van Fraassen [2003]. In particular, after making some minor points, I argue that Monton and van Fraassen have not done enough to show that the context dependence of counterfactuals renders their truth conditions non-objective, and I also argue that adopting modal realism does after all undermine the motivation for constructive empiricism.
BibTeX:
@article{ladyman-constructive-2004,
  author = {Ladyman, James},
  title = {Constructive Empiricism and Modal Metaphysics: A Reply to Monton and van Fraassen},
  journal = {The British Journal for the Philosophy of Science},
  year = {2004},
  volume = {55},
  number = {4},
  pages = {755--765}
}
Van Fraassen, B.C. Constructive Empiricism Now 2001 Philosophical Studies: An International Journal for Philosophy in the Analytic Tradition
Vol. 106(1/2), pp. 151-170 
article  
BibTeX:
@article{van-fraassen-constructive-2001,
  author = {Van Fraassen, Bas C.},
  title = {Constructive Empiricism Now},
  journal = {Philosophical Studies: An International Journal for Philosophy in the Analytic Tradition},
  year = {2001},
  volume = {106},
  number = {1/2},
  pages = {151--170}
}
Dicken, P. Constructive empiricism: epistemology and the philosophy of science 2010   book  
BibTeX:
@book{dicken-constructive-2010,
  author = {Dicken, Paul},
  title = {Constructive empiricism: epistemology and the philosophy of science},
  publisher = {Palgrave Macmillan},
  year = {2010}
}
Bagnoli, C. Constructivism in Metaethics 2016 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{bagnoli-constructivism-2016,
  author = {Bagnoli, Carla},
  title = {Constructivism in Metaethics},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2016},
  edition = {Winter 2016},
  url = {https://plato.stanford.edu/archives/win2016/entries/constructivism-metaethics/}
}
Deutsch, D. Constructor theory 2013 Synthese
Vol. 190(18), pp. 4331-4359 
article  
Abstract: Constructor theory seeks to express all fundamental scientific theories in terms of a dichotomy between possible and impossible physical transformations, those that can be caused to happen and those that cannot. This is a departure from the prevailing conception of fundamental physics which is to predict what will happen from initial conditions and laws of motion. Several converging motivations for expecting constructor theory to be a fundamental branch of physics are discussed. Some principles of the theory are suggested and its potential for solving various problems and achieving various unifications is explored. These include providing a theory of information underlying classical and quantum information; generalising the theory of computation to include all physical transformations; unifying formal statements of conservation laws with the stronger operational ones (such as the ruling-out of perpetual motion machines); expressing the principles of testability and of the computability of nature (currently deemed methodological and metaphysical respectively) as laws of physics; allowing exact statements of emergent laws (such as the second law of thermodynamics); and expressing certain apparently anthropocentric attributes such as knowledge in physical terms.
BibTeX:
@article{deutsch-constructor-2013,
  author = {Deutsch, David},
  title = {Constructor theory},
  journal = {Synthese},
  year = {2013},
  volume = {190},
  number = {18},
  pages = {4331--4359}
}
Shea, N. Consumers Need Information: Supplementing Teleosemantics with an Input Condition 2007 Philosophy and Phenomenological Research
Vol. 75(2), pp. 404-435 
article  
Abstract: The success of a piece of behaviour is often explained by its being caused by a true representation (similarly, failure by falsity). In some simple organisms, success is just survival and reproduction. Scientists explain why a piece of behaviour helped the organism to survive and reproduce by adverting to the behaviour's having been caused by a true representation. That usage should, if possible, be vindicated by an adequate naturalistic theory of content. Teleosemantics cannot do so, when it is applied to simple representing systems (Godfrey-Smith 1996). Here it is argued that the teleosemantic approach to content should therefore be modified, not abandoned, at least for simple representing systems. The new 'infotel-semantics' adds an input condition to the output condition offered by teleosemantics, recognising that it is constitutive of content in a simple representing system that the tokening of a representation should correlate probabilistically with the obtaining of its specific evolutionary success condition.
BibTeX:
@article{shea-consumers-2007,
  author = {Shea, Nicholas},
  title = {Consumers Need Information: Supplementing Teleosemantics with an Input Condition},
  journal = {Philosophy and Phenomenological Research},
  year = {2007},
  volume = {75},
  number = {2},
  pages = {404--435}
}
D'Agostino, F., Gaus, G. and Thrasher, J. Contemporary Approaches to the Social Contract 2014 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{dagostino-contemporary-2014,
  author = {D'Agostino, Fred and Gaus, Gerald and Thrasher, John},
  title = {Contemporary Approaches to the Social Contract},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2014},
  edition = {Spring 2014},
  url = {https://plato.stanford.edu/archives/spr2014/entries/contractarianism-contemporary/}
}
Tahko, T.E. Contemporary Aristotelian metaphysics 2012   book  
BibTeX:
@book{tahko-contemporary-2012,
  author = {Tahko, Tuomas E.},
  title = {Contemporary Aristotelian metaphysics},
  publisher = {Cambridge University Press},
  year = {2012}
}
Hitchcock, C. Contemporary debates in philosophy of science 2004
Vol. 2 
book  
BibTeX:
@book{hitchcock-contemporary-2004,
  author = {Hitchcock, Christopher},
  title = {Contemporary debates in philosophy of science},
  publisher = {Blackwell Pub},
  year = {2004},
  volume = {2}
}
Sterelny, K. Content, Control and Display: The Natural Origins of Content 2015 Philosophia
Vol. 43(3), pp. 549-564 
article  
Abstract: Hutto and Satne identify three research traditions attempting to explain the place of intentional agency in a wholly natural world: naturalistic reduction; sophisticated behaviourism, and pragmatism, and suggest that insights from all three are necessary. While agreeing with that general approach, I develop a somewhat different package, offering an outline of a vindicating genealogy of our interpretative practices. I suggest that these practices had their original foundation in the elaboration of much more complex representation-guided control structures in our lineage and the support and amplification of those control structures through external resources. Cranes (as Dennett calls them) became increasingly important in the explanation of systematically successful action. These more complex representational engines coevolved with selection to detect and respond to the control structures of others. Since much of that selection was driven by the advantages of cooperation and coordination, in part these control structures were co-opted as external signals and guarantees, in cooperation and coordination. As the time depth of cooperation and co-ordination extended, these public signals of belief and intent acquired secondary functions, as mechanisms to stabilise and structure control systems, making humans not just more transparent to one another at a time, but more predictable over time. Mindshaping, not just mindreading, became increasingly important in our lineage.
BibTeX:
@article{sterelny-content-2015,
  author = {Sterelny, Kim},
  title = {Content, Control and Display: The Natural Origins of Content},
  journal = {Philosophia},
  year = {2015},
  volume = {43},
  number = {3},
  pages = {549--564}
}
Durham, I.T. Contextuality: Wheeler’s universal regulating principle 2015 It From Bit or Bit From It?, pp. 213-223  incollection  
BibTeX:
@incollection{durham-contextuality:-2015,
  author = {Durham, Ian T},
  title = {Contextuality: Wheeler’s universal regulating principle},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {213--223}
}
Lewis, D.K. Convention: A Philosophical Study 2013   book  
BibTeX:
@book{lewis-convention:-2013,
  author = {Lewis, David K.},
  title = {Convention: A Philosophical Study},
  publisher = {Wiley},
  year = {2013}
}
Votsis, I. Conventionalism 2008
Vol. 39(1) 
book  
BibTeX:
@book{votsis-conventionalism-2008,
  author = {Votsis, Ioannis},
  title = {Conventionalism},
  year = {2008},
  volume = {39},
  number = {1}
}
Hooghiemstra, R. Corporate Communication and Impression Management: New Perspectives Why Companies Engage in Corporate Social Reporting 2000 Journal of Business Ethics
Vol. 27(1/2), pp. 55-68 
article  
Abstract: This paper addresses the theoretical framework on corporate social reporting. Although corporate social reporting has been analysed from different perspectives, legitimacy theory currently is the dominating perspective. Authors employing this framework suggest that social and environmental disclosures are responses to both public pressure and increased media attention resulting from major social incidents such as the Exxon Valdez oil spill and the chemical leak in Bhopal (India). More specifically, those authors argue that the increase in social disclosures represents a strategy to alter the public's perception about the legitimacy of the organisation. Therefore, we suggest using corporate communication as an overarching framework to study corporate social reporting in which "corporate image" and "corporate identity" are central.
BibTeX:
@article{hooghiemstra-corporate-2000,
  author = {Hooghiemstra, Reggy},
  title = {Corporate Communication and Impression Management: New Perspectives Why Companies Engage in Corporate Social Reporting},
  journal = {Journal of Business Ethics},
  year = {2000},
  volume = {27},
  number = {1/2},
  pages = {55--68}
}
Higson, A. Corporate financial reporting: theory and practice 2003   book  
BibTeX:
@book{higson-corporate-2003,
  author = {Higson, Andrew},
  title = {Corporate financial reporting: theory and practice},
  publisher = {SAGE},
  year = {2003}
}
Morsing, M. and Schultz, M. Corporate social responsibility communication: stakeholder information, response and involvement strategies: 1 2006 Business Ethics
Vol. 15(4), pp. 323 
article  
Abstract: While it is generally agreed that companies need to manage their relationships with their stakeholders, the way in which they choose to do so varies considerably. In this paper, it is argued that when companies want to communicate with stakeholders about their CSR initiatives, they need to involve those stakeholders in a two-way communication process, defined as an ongoing iterative sense-giving and sense-making process. The paper also argues that companies need to communicate through carefully crafted and increasingly sophisticated processes. Three CSR communication strategies are developed. Based on empirical illustrations and prior research, the authors argue that managers need to move from 'informing' and 'responding' to 'involving' stakeholders in CSR communication itself. They conclude that managers need to expand the role of stakeholders in corporate CSR communication processes if they want to improve their efforts to build legitimacy, a positive reputation and lasting stakeholder relationships. [PUBLICATION ABSTRACT]
BibTeX:
@article{morsing-corporate-2006,
  author = {Morsing, Mette and Schultz, Majken},
  title = {Corporate social responsibility communication: stakeholder information, response and involvement strategies: 1},
  journal = {Business Ethics},
  year = {2006},
  volume = {15},
  number = {4},
  pages = {323}
}
Buonomano, D.V. and Merzenich, M.M. Cortical plasticity: from synapses to maps 1998 Annu Rev Neurosci
Vol. 21 
article DOI URL 
BibTeX:
@article{buonomano-cortical-1998,
  author = {Buonomano, D. V. and Merzenich, M. M.},
  title = {Cortical plasticity: from synapses to maps},
  journal = {Annu Rev Neurosci},
  year = {1998},
  volume = {21},
  url = {http://dx.doi.org/10.1146/annurev.neuro.21.1.149},
  doi = {10.1146/annurev.neuro.21.1.149}
}
Hawking, S. Cosmology from the Top Down 2003   article  
BibTeX:
@article{hawking-cosmology-2003,
  author = {Hawking, Stephen},
  title = {Cosmology from the Top Down},
  year = {2003}
}
Gale, G. Cosmology: Methodological Debates in the 1930s and 1940s 2015 The Stanford Encyclopedia of Philosophy  incollection URL 
BibTeX:
@incollection{gale-cosmology:-2015,
  author = {Gale, George},
  title = {Cosmology: Methodological Debates in the 1930s and 1940s},
  booktitle = {The Stanford Encyclopedia of Philosophy},
  publisher = {Metaphysics Research Lab, Stanford University},
  year = {2015},
  edition = {Summer 2015},
  url = {https://plato.stanford.edu/archives/sum2015/entries/cosmology-30s/}
}
Lewis, D. Counterfactual Dependence and Time's Arrow 1979 Noûs
Vol. 13(4), pp. 455-476 
article  
BibTeX:
@article{lewis-counterfactual-1979,
  author = {Lewis, David},
  title = {Counterfactual Dependence and Time's Arrow},
  journal = {Noûs},
  year = {1979},
  volume = {13},
  number = {4},
  pages = {455--476}
}
Bloor, D. CRITICAL NOTICE OF IAN HACKING "THE SOCIAL CONSTRUCTION OF WHAT?" 2000 Canadian Journal of Philosophy
Vol. 30(4), pp. 597 
article  
BibTeX:
@article{bloor-critical-2000,
  author = {Bloor, David},
  title = {CRITICAL NOTICE OF IAN HACKING "THE SOCIAL CONSTRUCTION OF WHAT?"},
  journal = {Canadian Journal of Philosophy},
  year = {2000},
  volume = {30},
  number = {4},
  pages = {597}
}
Seising, R. Cybernetics, system(s) theory, information theory and Fuzzy Sets and Systems in the 1950s and 1960s 2010 Information Sciences
Vol. 180(23), pp. 4459-4476 
article  
BibTeX:
@article{seising-cybernetics-2010,
  author = {Seising, Rudolf},
  title = {Cybernetics, system(s) theory, information theory and Fuzzy Sets and Systems in the 1950s and 1960s},
  journal = {Information Sciences},
  year = {2010},
  volume = {180},
  number = {23},
  pages = {4459--4476}
}
Wiener, N. Cybernetics: or control and communication in the animal and the machine 1961   book  
BibTeX:
@book{wiener-cybernetics:-1961,
  author = {Wiener, Norbert},
  title = {Cybernetics: or control and communication in the animal and the machine},
  publisher = {M.I.T. Press},
  year = {1961},
  edition = {2nd}
}
Armstrong, D.M. and Bogdan, R.J. D.M. Armstrong 1984
Vol. 4 
book  
BibTeX:
@book{armstrong-d.m.-1984,
  author = {Armstrong, D. M. and Bogdan, Radu J.},
  title = {D.M. Armstrong},
  publisher = {D. Reidel Pub. Co},
  year = {1984},
  volume = {4}
}
Greene, B.R. D-brane topology changing transitions 1998 Nuclear Physics, Section B
Vol. 525(1), pp. 284-296 
article  
BibTeX:
@article{greene-d-brane-1998,
  author = {Greene, Brian R.},
  title = {D-brane topology changing transitions},
  journal = {Nuclear Physics, Section B},
  year = {1998},
  volume = {525},
  number = {1},
  pages = {284--296}
}
De Chirico, G. and Faerna, J.M. De Chirico 1995   book  
BibTeX:
@book{de-chirico-chirico-1995,
  author = {De Chirico, Giorgio and Faerna, José M.},
  title = {De Chirico},
  publisher = {Cameo/Abrams [i.e. Abrams/Cameo]},
  year = {1995}
}
Liu, M., Hua, Q.-x., Hu, S.-Q., Jia, W., Yang, Y., Saith, S.E., Whittaker, J., Arvan, P. and Weiss, M.A. Deciphering the hidden informational content of protein sequences: foldability of proinsulin hinges on a flexible arm that is dispensable in the mature hormone 2010 The Journal of biological chemistry
Vol. 285(40), pp. 30989-31001 
article  
Abstract: Protein sequences encode both structure and foldability. Whereas the interrelationship of sequence and structure has been extensively investigated, the origins of folding efficiency are enigmatic. We demonstrate that the folding of proinsulin requires a flexible N-terminal hydrophobic residue that is dispensable for the structure, activity, and stability of the mature hormone. This residue (Phe(B1) in placental mammals) is variably positioned within crystal structures and exhibits (1)H NMR motional narrowing in solution. Despite such flexibility, its deletion impaired insulin chain combination and led in cell culture to formation of non-native disulfide isomers with impaired secretion of the variant proinsulin. Cellular folding and secretion were maintained by hydrophobic substitutions at B1 but markedly perturbed by polar or charged side chains. We propose that, during folding, a hydrophobic side chain at B1 anchors transient long-range interactions by a flexible N-terminal arm (residues B1-B8) to mediate kinetic or thermodynamic partitioning among disulfide intermediates. Evidence for the overall contribution of the arm to folding was obtained by alanine scanning mutagenesis. Together, our findings demonstrate that efficient folding of proinsulin requires N-terminal sequences that are dispensable in the native state. Such arm-dependent folding can be abrogated by mutations associated with β-cell dysfunction and neonatal diabetes mellitus.
BibTeX:
@article{liu-deciphering-2010,
  author = {Liu, Ming and Hua, Qing-xin and Hu, Shi-Quan and Jia, Wenhua and Yang, Yanwu and Saith, Sunil E. and Whittaker, Jonathan and Arvan, Peter and Weiss, Michael A.},
  title = {Deciphering the hidden informational content of protein sequences: foldability of proinsulin hinges on a flexible arm that is dispensable in the mature hormone},
  journal = {The Journal of biological chemistry},
  year = {2010},
  volume = {285},
  number = {40},
  pages = {30989--31001}
}
Wurm, M.F. and Lingnau, A. Decoding actions at different levels of abstraction 2015 The Journal of neuroscience : the official journal of the Society for Neuroscience
Vol. 35(20), pp. 7727 
article  
Abstract: Brain regions that mediate action understanding must contain representations that are action specific and at the same time tolerate a wide range of perceptual variance. Whereas progress has been made in understanding such generalization mechanisms in the object domain, the neural mechanisms to conceptualize actions remain unknown. In particular, there is ongoing dissent between motor-centric and cognitive accounts whether premotor cortex or brain regions in closer relation to perceptual systems, i.e., lateral occipitotemporal cortex, contain neural populations with such mapping properties. To date, it is unclear to which degree action-specific representations in these brain regions generalize from concrete action instantiations to abstract action concepts. However, such information would be crucial to differentiate between motor and cognitive theories. Using ROI-based and searchlight-based fMRI multivoxel pattern decoding, we sought brain regions in human cortex that manage the balancing act between specificity and generality. We investigated a concrete level that distinguishes actions based on perceptual features (e.g., opening vs closing a specific bottle), an intermediate level that generalizes across movement kinematics and specific objects involved in the action (e.g., opening different bottles with cork or screw cap), and an abstract level that additionally generalizes across object category (e.g., opening bottles or boxes). We demonstrate that the inferior parietal and occipitotemporal cortex code actions at abstract levels whereas the premotor cortex codes actions at the concrete level only. Hence, occipitotemporal, but not premotor, regions fulfill the necessary criteria for action understanding. This result is compatible with cognitive theories but strongly undermines motor theories of action understanding.
BibTeX:
@article{wurm-decoding-2015,
  author = {Wurm, Moritz F. and Lingnau, Angelika},
  title = {Decoding actions at different levels of abstraction},
  journal = {The Journal of neuroscience : the official journal of the Society for Neuroscience},
  year = {2015},
  volume = {35},
  number = {20},
  pages = {7727}
}
Vedral, V. Decoding reality: the universe as quantum information 2010   book  
BibTeX:
@book{vedral-decoding-2010,
  author = {Vedral, Vlatko},
  title = {Decoding reality: the universe as quantum information},
  publisher = {Oxford University Press},
  year = {2010}
}
Zurek, W.H. Decoherence, einselection, and the quantum origins of the classical 2003 Reviews of Modern Physics
Vol. 75(3), pp. 715-775 
article  
BibTeX:
@article{zurek-decoherence-2003,
  author = {Zurek, Wojciech H.},
  title = {Decoherence, einselection, and the quantum origins of the classical},
  journal = {Reviews of Modern Physics},
  year = {2003},
  volume = {75},
  number = {3},
  pages = {715--775}
}
Halvorson, H. Deep beauty: understanding the quantum world through mathematical innovation 2011   book  
BibTeX:
@book{halvorson-deep-2011,
  author = {Halvorson, Hans},
  title = {Deep beauty: understanding the quantum world through mathematical innovation},
  publisher = {Cambridge University Press},
  year = {2011}
}
Batterman, R.W. Defining Chaos 1993 Philosophy of Science
Vol. 60(1), pp. 43-66 
article  
Abstract: This paper considers definitions of classical dynamical chaos that focus primarily on notions of predictability and computability, sometimes called algorithmic complexity definitions of chaos. I argue that accounts of this type are seriously flawed. They focus on a likely consequence of chaos, namely, randomness in behavior which gets characterized in terms of the unpredictability or uncomputability of final given initial states. In doing so, however, they can overlook the definitive feature of dynamical chaos: the fact that the underlying motion generating the behavior exhibits extreme trajectory instability. I formulate a simple criterion of adequacy for any definition of chaos and show how such accounts fail to satisfy it.
BibTeX:
@article{batterman-defining-1993,
  author = {Batterman, Robert W.},
  title = {Defining Chaos},
  journal = {Philosophy of Science},
  year = {1993},
  volume = {60},
  number = {1},
  pages = {43--66}
}
Martelli, M., Dang, M. and Seph, T. Defining Chaos 1998 Mathematics Magazine
Vol. 71(2), pp. 112-122 
article  
Abstract: Many aspects of nature consist of "random" patterns which turn out to be completely predictable. Martelli, Dang, and Seph define these "chaotic systems" in a manner that is comprehensible to undergraduate students.
BibTeX:
@article{martelli-defining-1998,
  author = {Martelli, Mario and Dang, Mai and Seph, Tanya},
  title = {Defining Chaos},
  journal = {Mathematics Magazine},
  year = {1998},
  volume = {71},
  number = {2},
  pages = {112--122}
}
Ney, A. Defining Physicalism 2008
Vol. 3(5), pp. 1033-1048 
article  
BibTeX:
@article{ney-defining-2008,
  author = {Ney, Alyssa},
  title = {Defining Physicalism},
  year = {2008},
  volume = {3},
  number = {5},
  pages = {1033--1048}
}
Hawking, S., Maldacena, J. and Strominger, A. DeSitter entropy, quantum entanglement and ADS/CFT 2001 Journal of High Energy Physics
Vol. 2001(5), pp. 001-1 
article  
Abstract: A de Sitter brane-world bounding regions of anti-de Sitter space has a macroscopic entropy given by one-quarter the area of the observer horizon. A proposed variant of the AdS/CFT correspondence gives a dual description of this cosmology as conformal field theory coupled to gravity in de Sitter space. In the case of two-dimensional de Sitter space this provides a microscopic derivation of the entropy, including the one-quarter, as quantum entanglement of the conformal field theory across the horizon.
BibTeX:
@article{hawking-desitter-2001,
  author = {Hawking, Stephen and Maldacena, Juan and Strominger, Andrew},
  title = {DeSitter entropy, quantum entanglement and ADS/CFT},
  journal = {Journal of High Energy Physics},
  year = {2001},
  volume = {2001},
  number = {5},
  pages = {001--1}
}
Reines, F. and Cowan Jr, C.L. Detection of the free neutrino 1953 Physical Review
Vol. 92(3), pp. 830-831 
article  
BibTeX:
@article{reines-detection-1953,
  author = {Reines, F. and Cowan Jr, C. L.},
  title = {Detection of the free neutrino},
  journal = {Physical Review},
  year = {1953},
  volume = {92},
  number = {3},
  pages = {830--831}
}
Cowan, C.L., Reines, F., Harrison, F.B., Kruse, H.W. and McGuire, A.D. Detection of the Free Neutrino: A Confirmation 1956 Science
Vol. 124(3212), pp. 103-104 
article  
BibTeX:
@article{cowan-detection-1956,
  author = {Cowan, C. L. and Reines, F. and Harrison, F. B. and Kruse, H. W. and McGuire, A. D.},
  title = {Detection of the Free Neutrino: A Confirmation},
  journal = {Science},
  year = {1956},
  volume = {124},
  number = {3212},
  pages = {103--104}
}
Bedke, M.S. Developmental Process Reliabilism: on Justification, Defeat, and Evidence 2010 Erkenntnis (1975-)
Vol. 73(1), pp. 1-17 
article  
Abstract: Here I present and defend an etiological theory of objective, doxastic justification, and related theories of defeat and evidence. The theory is intended to solve a problem for reliabilist epistemologies: the problem of identifying relevant environments for assessing a process's reliability. It is also intended to go some way to accommodating, neutralizing, or explaining away many internalist-friendly elements in our epistemic thinking.
BibTeX:
@article{bedke-developmental-2010,
  author = {Bedke, Matthew S.},
  title = {Developmental Process Reliabilism: on Justification, Defeat, and Evidence},
  journal = {Erkenntnis (1975-)},
  year = {2010},
  volume = {73},
  number = {1},
  pages = {1--17}
}
Griffiths, P.E. and Gray, R.D. Developmental Systems and Evolutionary Explanation 1994 The Journal of Philosophy
Vol. 91(6), pp. 277-304 
article  
BibTeX:
@article{griffiths-developmental-1994,
  author = {Griffiths, P. E. and Gray, R. D.},
  title = {Developmental Systems and Evolutionary Explanation},
  journal = {The Journal of Philosophy},
  year = {1994},
  volume = {91},
  number = {6},
  pages = {277--304}
}
Shea, N. Developmental Systems Theory Formulated as a Claim about Inherited Representations 2011 Philosophy of Science
Vol. 78(1), pp. 60-82 
article  
Abstract: Developmental systems theory (DST) is often dismissed on the basis that the causal indispensability of nongenetic factors in evolution and development has long been appreciated. A reformulation makes a more substantive claim: that the special role played by genes is also played by some (but not all) nongenetic resources. That special role can be captured by Shea's 'inherited representation'. Formulating DST as the claim that there are nongenetic inherited representations turns it into a striking, empirically testable hypothesis. DST's characteristic rejection of a gene versus environment dichotomy is preserved but without dissolving into an interactionist causal soup, as some have alleged.
BibTeX:
@article{shea-developmental-2011,
  author = {Shea, Nicholas},
  title = {Developmental Systems Theory Formulated as a Claim about Inherited Representations},
  journal = {Philosophy of Science},
  year = {2011},
  volume = {78},
  number = {1},
  pages = {60--82}
}
Anonymous Digital Signal Processing 2015   book  
BibTeX:
@book{anonymous-digital-2015,
  author = {Anonymous},
  title = {Digital Signal Processing},
  publisher = {Oxford University Press},
  year = {2015}
}
Bueno, O. Dirac and the dispensability of mathematics 2005 Studies in History and Philosophy of Modern Physics
Vol. 36(3), pp. 465-490 
article  
Abstract: In this paper, I examine the role of the delta function in Dirac's formulation of quantum mechanics (QM), and I discuss, more generally, the role of mathematics in theory construction. It has been argued that mathematical theories play an indispensable role in physics, particularly in QM [Colyvan, M. (2001). The indispensability of mathematics. Oxford University Press: Oxford]. As I argue here, at least in the case of the delta function, Dirac was very clear about its dispensability. I first discuss the significance of the delta function in Dirac's work, and explore the strategy that he devised to overcome its use. I then argue that even if mathematical theories turned out to be indispensable, this wouldn't justify the commitment to the existence of mathematical entities. In fact, even in successful uses of mathematics, such as in Dirac's discovery of antimatter, there's no need to believe in the existence of the corresponding mathematical entities. An interesting picture about the application of mathematics emerges from a careful examination of Dirac's work.
BibTeX:
@article{bueno-dirac-2005,
  author = {Bueno, Otávio},
  title = {Dirac and the dispensability of mathematics},
  journal = {Studies in History and Philosophy of Modern Physics},
  year = {2005},
  volume = {36},
  number = {3},
  pages = {465--490}
}
Perinotti, P. Discord and Nonclassicality in Probabilistic Theories 2012 Phys. Rev. Lett.
Vol. 108(12), pp. 120502 
article DOI URL 
BibTeX:
@article{perinotti-discord-2012,
  author = {Perinotti, Paolo},
  title = {Discord and Nonclassicality in Probabilistic Theories},
  journal = {Phys. Rev. Lett.},
  year = {2012},
  volume = {108},
  number = {12},
  pages = {120502},
  url = {http://link.aps.org/doi/10.1103/PhysRevLett.108.120502},
  doi = {10.1103/PhysRevLett.108.120502}
}
Rousseau, J.-J. and Cress, D.A. Discourse on the origin of inequality 1992   book  
BibTeX:
@book{rousseau-discourse-1992,
  author = {Rousseau, Jean-Jacques and Cress, Donald A.},
  title = {Discourse on the origin of inequality},
  publisher = {Hackett Pub. Co},
  year = {1992}
}
Clifford, J. and Adami, C. Discovery and information-theoretic characterization of transcription factor binding sites that act cooperatively 2015 PHYSICAL BIOLOGY
Vol. 12(5), pp. 056004 
article  
Abstract: Transcription factor binding to the surface of DNA regulatory regions is one of the primary causes of regulating gene expression levels. A probabilistic approach to model protein-DNA interactions at the sequence level is through Position Weight Matrices (PWMs) that estimate the joint probability of a DNA binding site sequence by assuming positional independence within the DNA sequence. Here we construct conditional PWMs that depend on the motif signatures in the flanking DNA sequence, by conditioning known binding site loci on the presence or absence of additional binding sites in the flanking sequence of each site's locus. Pooling known sites with similar flanking sequence patterns allows for the estimation of the conditional distribution function over the binding site sequences. We apply our model to the Dorsal transcription factor binding sites active in patterning the Dorsal-Ventral axis of Drosophila development. We find that those binding sites that cooperate with nearby Twist sites on average contain about 0.5 bits of information about the presence of Twist transcription factor binding sites in the flanking sequence. We also find that Dorsal binding site detectors conditioned on flanking sequence information make better predictions about what is a Dorsal site relative to background DNA than detection without information about flanking sequence features.
BibTeX:
@article{clifford-discovery-2015,
  author = {Clifford, J. and Adami, C.},
  title = {Discovery and information-theoretic characterization of transcription factor binding sites that act cooperatively},
  journal = {PHYSICAL BIOLOGY},
  year = {2015},
  volume = {12},
  number = {5},
  pages = {056004}
}
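The PWM construction described in the Clifford and Adami abstract can be sketched in a few lines; the aligned example sites, the pseudocount, and the uniform background are illustrative assumptions, not data from the paper.

```python
import math

def position_weight_matrix(sites, alphabet="ACGT", pseudocount=1.0):
    """Estimate per-position base probabilities from aligned binding sites."""
    length = len(sites[0])
    pwm = []
    for i in range(length):
        counts = {b: pseudocount for b in alphabet}
        for site in sites:
            counts[site[i]] += 1
        total = sum(counts.values())
        pwm.append({b: counts[b] / total for b in alphabet})
    return pwm

def information_content(pwm, background=0.25):
    """Per-position information (bits) relative to a uniform background,
    i.e. the KL divergence of each column from the background."""
    return [sum(p * math.log2(p / background) for p in col.values() if p > 0)
            for col in pwm]

# Toy aligned sites (hypothetical, for illustration only)
sites = ["ACGT", "ACGA", "ACGT", "ACTT"]
pwm = position_weight_matrix(sites)
bits = information_content(pwm)
```

Conserved positions (all four sites agree) carry more bits than variable ones; this per-position independence is exactly the assumption the paper's conditional PWMs relax.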
Lyakhov, I.G., Krishnamachari, A. and Schneider, T.D. Discovery of novel tumor suppressor p53 response elements using information theory 2008 Nucleic acids research
Vol. 36(11), pp. 3828-3833 
article  
Abstract: An accurate method for locating genes under tumor suppressor p53 control that is based on a well-established mathematical theory and built using naturally occurring, experimentally proven p53 sites is essential in understanding the complete p53 network. We used a molecular information theory approach to create a flexible model for p53 binding. By searching around transcription start sites in human chromosomes 1 and 2, we predicted 16 novel p53 binding sites and experimentally demonstrated that 15 of the 16 (94%) sites were bound by p53. Some were also bound by the related proteins p63 and p73. Thirteen of the adjacent genes were controlled by at least one of the proteins. Eleven of the 16 sites (69%) had not been identified previously. This molecular information theory approach can be extended to any genetic system to predict new sites for DNA-binding proteins.
BibTeX:
@article{lyakhov-discovery-2008,
  author = {Lyakhov, Ilya G. and Krishnamachari, Annangarachari and Schneider, Thomas D.},
  title = {Discovery of novel tumor suppressor p53 response elements using information theory},
  journal = {Nucleic acids research},
  year = {2008},
  volume = {36},
  number = {11},
  pages = {3828--3833}
}
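A site-scanning step in the spirit of the molecular information theory approach above can be sketched as a log-odds scan over a sequence; the motif weights and threshold below are made-up illustrative values, not Schneider's published p53 model.

```python
# Hypothetical per-position log-odds weights (bits) for a length-4 motif,
# e.g. derived from a PWM as log2(p_base / 0.25). Illustrative values only.
WEIGHTS = [
    {"A": 1.3, "C": -2.0, "G": -2.0, "T": -2.0},
    {"A": -2.0, "C": 1.3, "G": -2.0, "T": -2.0},
    {"A": -2.0, "C": -2.0, "G": 1.3, "T": -2.0},
    {"A": -0.5, "C": -2.0, "G": -2.0, "T": 1.0},
]

def scan(sequence, weights, threshold=0.0):
    """Slide the motif model along the sequence; report windows scoring
    above threshold as candidate binding sites."""
    width = len(weights)
    hits = []
    for start in range(len(sequence) - width + 1):
        window = sequence[start:start + width]
        score = sum(weights[i][b] for i, b in enumerate(window))
        if score > threshold:
            hits.append((start, window, score))
    return hits

hits = scan("TTACGTGG", WEIGHTS)
```

Only windows whose summed bits clear the threshold are reported, which is how a search "around transcription start sites" narrows thousands of windows down to a handful of candidates for experimental follow-up.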
Goldman, A.I. Discrimination and Perceptual Knowledge 1976 The Journal of Philosophy
Vol. 73(20), pp. 771-791 
article  
BibTeX:
@article{goldman-discrimination-1976,
  author = {Goldman, Alvin I.},
  title = {Discrimination and Perceptual Knowledge},
  journal = {The Journal of Philosophy},
  year = {1976},
  volume = {73},
  number = {20},
  pages = {771--791}
}
Griffiths, P.E. and Gray, R.D. Discussion: Three Ways to Misunderstand Developmental Systems Theory 2005 Biology & Philosophy
Vol. 20(2), pp. 417-425 
article  
Abstract: Developmental systems theory (DST) is a general theoretical perspective on development, heredity and evolution. It is intended to facilitate the study of interactions between the many factors that influence development without reviving 'dichotomous' debates over nature or nurture, gene or environment, biology or culture. Several recent papers have addressed the relationship between DST and the thriving new discipline of evolutionary developmental biology (EDB). The contributions to this literature by evolutionary developmental biologists contain three important misunderstandings of DST.
BibTeX:
@article{griffiths-discussion:-2005,
  author = {Griffiths, Paul E. and Gray, Russell D.},
  title = {Discussion: Three Ways to Misunderstand Developmental Systems Theory},
  journal = {Biology & Philosophy},
  year = {2005},
  volume = {20},
  number = {2},
  pages = {417--425}
}
Handfield, T. Dispositions and causes 2009   book  
BibTeX:
@book{handfield-dispositions-2009,
  author = {Handfield, Toby},
  title = {Dispositions and causes},
  publisher = {Clarendon Press},
  year = {2009}
}
Joshi, P.S., Malafarina, D. and Narayan, R. Distinguishing black holes from naked singularities through their accretion disk properties 2013   article  
Abstract: We show that, in principle, a slowly evolving gravitationally collapsing perfect fluid cloud can asymptotically settle to a static spherically symmetric equilibrium configuration with a naked singularity at the center. We consider one such asymptotic final configuration with a finite outer radius, and construct a toy model in which it is matched to a Schwarzschild exterior geometry. We examine the properties of circular orbits in this model. We then investigate observational signatures of a thermal accretion disk in this spacetime, comparing them with the signatures expected for a disk around a black hole of the same mass. Several notable differences emerge. A disk around the naked singularity is much more luminous than one around an equivalent black hole. Also, the disk around the naked singularity has a spectrum with a high frequency power law segment that carries a major fraction of the total luminosity. Thus, at least some naked singularities can, in principle, be distinguished observationally from black holes of the same mass. We discuss possible implications of these results.
BibTeX:
@article{joshi-distinguishing-2013,
  author = {Joshi, Pankaj S. and Malafarina, Daniele and Narayan, Ramesh},
  title = {Distinguishing black holes from naked singularities through their accretion disk properties},
  year = {2013}
}
Gray, R.M. Distortion and Approximation 2011 Entropy and Information Theory, pp. 117-146  incollection  
BibTeX:
@incollection{gray-distortion-2011,
  author = {Gray, Robert M.},
  title = {Distortion and Approximation},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {117--146}
}
Gray, R.M. Distortion and Entropy 2011 Entropy and Information Theory, pp. 147-171  incollection  
BibTeX:
@incollection{gray-distortion-2011-1,
  author = {Gray, Robert M.},
  title = {Distortion and Entropy},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {147--171}
}
Gray, R.M. Distortion and Information 2011 Entropy and Information Theory, pp. 237-263  incollection  
BibTeX:
@incollection{gray-distortion-2011-2,
  author = {Gray, Robert M.},
  title = {Distortion and Information},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {237--263}
}
Mamathambika, B.S. and Bardwell, J.C. Disulfide-linked protein folding pathways 2008 Annual review of cell and developmental biology
Vol. 24(1), pp. 211-235 
article  
Abstract: Determining the mechanism by which proteins attain their native structure is an important but difficult problem in basic biology. The study of protein folding is difficult because it involves the identification and characterization of folding intermediates that are only very transiently present. Disulfide bond formation is thermodynamically linked to protein folding. The availability of thiol trapping reagents and the relatively slow kinetics of disulfide bond formation have facilitated the isolation, purification, and characterization of disulfide-linked folding intermediates. As a result, the folding pathways of several disulfide-rich proteins are among the best known of any protein. This review discusses disulfide bond formation and its relationship to protein folding in vitro and in vivo.
BibTeX:
@article{mamathambika-disulfide-linked-2008,
  author = {Mamathambika, Bharath S. and Bardwell, James C.},
  title = {Disulfide-linked protein folding pathways},
  journal = {Annual review of cell and developmental biology},
  year = {2008},
  volume = {24},
  number = {1},
  pages = {211--235}
}
Stegmann, U.E. DNA, Inference, and Information 2009 The British Journal for the Philosophy of Science
Vol. 60(1), pp. 1-17 
article  
Abstract: This paper assesses Sarkar's ([2003]) deflationary account of genetic information. On Sarkar's account, genes carry information about proteins because protein synthesis exemplifies what Sarkar calls a 'formal information system'. Furthermore, genes are informationally privileged over non-genetic factors of development because only genes enter into arbitrary relations to their products (in virtue of the alleged arbitrariness of the genetic code). I argue that the deflationary theory does not capture four essential features of the ordinary concept of genetic information: intentionality, exclusiveness, asymmetry, and causal relevance. It is therefore further removed from what is customarily meant by genetic information than Sarkar admits. Moreover, I argue that it is questionable whether the account succeeds in demonstrating that information is theoretically useful in molecular genetics.
BibTeX:
@article{stegmann-dna-2009,
  author = {Stegmann, Ulrich E.},
  title = {DNA, Inference, and Information},
  journal = {The British Journal for the Philosophy of Science},
  year = {2009},
  volume = {60},
  number = {1},
  pages = {1--17}
}
Chittka, L. and Walker, J. Do bees like Van Gogh's Sunflowers? 2006 Optics and Laser Technology
Vol. 38(4), pp. 323-328 
article  
Abstract: Flower colours have evolved over 100 million years to address the colour vision of their bee pollinators. In a much more rapid process, cultural (and horticultural) evolution has produced images of flowers that stimulate aesthetic responses in human observers. The colour vision and analysis of visual patterns differ in several respects between humans and bees. Here, a behavioural ecologist and an installation artist present bumblebees with reproductions of paintings highly appreciated in Western society, such as Van Gogh's Sunflowers. We use this unconventional approach in the hope to raise awareness for between-species differences in visual perception, and to provoke thinking about the implications of biology in human aesthetics and the relationship between object representation and its biological connotations.
BibTeX:
@article{chittka-bees-2006,
  author = {Chittka, Lars and Walker, Julian},
  title = {Do bees like Van Gogh's Sunflowers?},
  journal = {Optics and Laser Technology},
  year = {2006},
  volume = {38},
  number = {4},
  pages = {323--328}
}
Hacking, I. Do We See Through a Microscope? 1981 Pacific Philosophical Quarterly
Vol. 62(4), pp. 305 
article  
BibTeX:
@article{hacking-we-1981,
  author = {Hacking, Ian},
  title = {Do We See Through a Microscope?},
  journal = {Pacific Philosophical Quarterly},
  year = {1981},
  volume = {62},
  number = {4},
  pages = {305}
}
Beebee, H. Does Anything Hold the Universe Together? 2006 Synthese
Vol. 149(3), pp. 509-533 
article  
BibTeX:
@article{beebee-does-2006,
  author = {Beebee, Helen},
  title = {Does Anything Hold the Universe Together?},
  journal = {Synthese},
  year = {2006},
  volume = {149},
  number = {3},
  pages = {509--533}
}
Baker, A. Does the Existence of Mathematical Objects Make a Difference? 2003 Australasian Journal of Philosophy
Vol. 81(2), pp. 246-264 
article  
Abstract: In this paper I examine a strategy which aims to bypass the technicalities of the indispensability debate and to offer a direct route to nominalism. The starting-point for this alternative nominalist strategy is the claim that–according to the platonist picture–the existence of mathematical objects makes no difference to the concrete, physical world. My principal goal is to show that the 'Makes No Difference' (MND) Argument does not succeed in undermining platonism. The basic reason why not is that the makes-no-difference claim which the argument is based on is problematic. Arguments both for and against this claim can be found in the literature; I examine three such arguments, uncovering flaws in each one. In the second half of the paper, I take a more direct approach and present an analysis of the counterfactual which underpins the makes-no-difference claim. What this analysis reveals is that indispensability considerations are in fact crucial to the proper evaluation of the MND Argument, contrary to the claims of its supporters.
BibTeX:
@article{baker-does-2003,
  author = {Baker, A.},
  title = {Does the Existence of Mathematical Objects Make a Difference?},
  journal = {Australasian Journal of Philosophy},
  year = {2003},
  volume = {81},
  number = {2},
  pages = {246--264}
}
Sarkar, S. Does “Information” Provide a Compelling Framework for a Theory of Natural Selection? Grounds for Caution 2014 Philosophy of Science
Vol. 81(1), pp. 22-30 
article  
Abstract: Frank has recently argued for an information-theoretic interpretation of natural selection. This interpretation is based on the identification of a measure related to the Malthusian parameter (for population change) with the Jeffreys divergence between the present allelic distribution of the population and that distribution in the next generation. It is pointed out in this analysis that this identification only holds if the mean fitness of the population is a constant, that is, there is no selection. This problem is used to argue for the superiority of the standard dynamical interpretation of natural selection over its information-theoretic counterpart.
BibTeX:
@article{sarkar-does-2014,
  author = {Sarkar, Sahotra},
  title = {Does “Information” Provide a Compelling Framework for a Theory of Natural Selection? Grounds for Caution},
  journal = {Philosophy of Science},
  year = {2014},
  volume = {81},
  number = {1},
  pages = {22--30}
}
Planat, M. Drawing quantum contextuality with ‘dessins d’enfants’ 2015 It From Bit or Bit From It?, pp. 37-50  incollection  
BibTeX:
@incollection{planat-drawing-2015,
  author = {Planat, Michel},
  title = {Drawing quantum contextuality with ‘dessins d’enfants’},
  booktitle = {It From Bit or Bit From It?},
  publisher = {Springer},
  year = {2015},
  pages = {37--50}
}
Hobson, J.A., Pace-Schott, E.F. and Stickgold, R. Dreaming and the brain: toward a cognitive neuroscience of conscious states 2000 Behav Brain Sci
Vol. 23 
article DOI URL 
BibTeX:
@article{hobson-dreaming-2000,
  author = {Hobson, J. A. and Pace-Schott, E. F. and Stickgold, R.},
  title = {Dreaming and the brain: toward a cognitive neuroscience of conscious states},
  journal = {Behav Brain Sci},
  year = {2000},
  volume = {23},
  url = {http://dx.doi.org/10.1017/S0140525X00003976},
  doi = {10.1017/S0140525X00003976}
}
Engel, A.K., Fries, P. and Singer, W. Dynamic predictions: oscillations and synchrony in top-down processing 2001 Nat Rev Neurosci
Vol. 2 
article DOI URL 
BibTeX:
@article{engel-dynamic-2001,
  author = {Engel, A. K. and Fries, P. and Singer, W.},
  title = {Dynamic predictions: oscillations and synchrony in top-down processing},
  journal = {Nat Rev Neurosci},
  year = {2001},
  volume = {2},
  url = {http://dx.doi.org/10.1038/35094565},
  doi = {10.1038/35094565}
}
Wolf, F., Engelken, R., Puelma-Touzel, M., Weidinger, J.D.F. and Neef, A. Dynamical models of cortical circuits 2014 Current Opinion in Neurobiology
Vol. 25, pp. 228-236 
article DOI URL 
Abstract: Cortical neurons operate within recurrent neuronal circuits. Dissecting their operation is key to understanding information processing in the cortex and requires transparent and adequate dynamical models of circuit function. Convergent evidence from experimental and theoretical studies indicates that strong feedback inhibition shapes the operating regime of cortical circuits. For circuits operating in inhibition-dominated regimes, mathematical and computational studies over the past several years achieved substantial advances in understanding response modulation and heterogeneity, emergent stimulus selectivity, inter-neuron correlations, and microstate dynamics. The latter indicate a surprisingly strong dependence of the collective circuit dynamics on the features of single neuron action potential generation. New approaches are needed to definitively characterize the cortical operating regime.
BibTeX:
@article{wolf-dynamical-2014,
  author = {Wolf, Fred and Engelken, Rainer and Puelma-Touzel, Maximilian and Weidinger, Juan Daniel Flórez and Neef, Andreas},
  title = {Dynamical models of cortical circuits},
  journal = {Current Opinion in Neurobiology},
  year = {2014},
  volume = {25},
  pages = {228--236},
  url = {http://www.sciencedirect.com/science/article/pii/S0959438814000324},
  doi = {10.1016/j.conb.2014.01.017}
}
Woschni, E.-G. Dynamics of measurement-relations to system and information theory 1977 Journal of Physics E: Scientific Instruments
Vol. 10(11), pp. 1081 
article URL 
Abstract: The article begins by pointing out the requirements placed on the dynamics of measurement due to advanced computing techniques and automation engineering. This is followed by a survey of basic terms such as root mean square deviation and other error definitions, their correlations and interpretations with the aid of modern geometry in Euclidean and non-Euclidean spaces. Relations between data storage and measuring errors are shown by use of the findings of the information theory. Furthermore, methods for the analysis and synthesis, i.e. optimisation of measuring systems are discussed.
BibTeX:
@article{woschni-dynamics-1977,
  author = {Woschni, E.-G.},
  title = {Dynamics of measurement-relations to system and information theory},
  journal = {Journal of Physics E: Scientific Instruments},
  year = {1977},
  volume = {10},
  number = {11},
  pages = {1081},
  url = {http://stacks.iop.org/0022-3735/10/i=11/a=001}
}
Norton, J.D. Eaters of the lotus: Landauer's principle and the return of Maxwell's demon 2005 Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics
Vol. 36(2), pp. 375-411 
article DOI URL 
Abstract: Landauer's principle is the loosely formulated notion that the erasure of n bits of information must always incur a cost of k ln n in thermodynamic entropy. It can be formulated as a precise result in statistical mechanics, but for a restricted class of erasure processes that use a thermodynamically irreversible phase space expansion, which is the real origin of the law's entropy cost and whose necessity has not been demonstrated. General arguments that purport to establish the unconditional validity of the law (erasure maps many physical states to one; erasure compresses the phase space) fail. They turn out to depend on the illicit formation of a canonical ensemble from memory devices holding random data. To exorcise Maxwell's demon one must show that all candidate devices—the ordinary and the extraordinary—must fail to reverse the second law of thermodynamics. The theorizing surrounding Landauer's principle is too fragile and too tied to a few specific examples to support such general exorcism. Charles Bennett's recent extension of Landauer's principle to the merging of computational paths fails for the same reasons as trouble the original principle.
BibTeX:
@article{norton-eaters-2005,
  author = {Norton, John D.},
  title = {Eaters of the lotus: Landauer's principle and the return of Maxwell's demon},
  journal = {Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics},
  year = {2005},
  volume = {36},
  number = {2},
  pages = {375--411},
  url = {http://www.sciencedirect.com/science/article/pii/S1355219804000851},
  doi = {10.1016/j.shpsb.2004.12.002}
}
Ginzburg, L.R., Colyvan, M. and NetLibrary, I. Ecological orbits: how planets move and populations grow 2004   book  
Abstract: This book proposes a new approach to population biology and ecology. The current paradigm for analyzing population dynamics focuses attention on the growth rate as the main variable responding to the environment, and often leads to predictions of runaway acceleration seldom actually seen in nature. This book proposes and develops an inertial view of population growth, taking note of acceleration, or rate of change of the growth rate between consecutive generations, which allows a simpler model for complex population dynamics, often without invoking species interactions, that appears to fit the actual outcomes better than traditional Lotka-Volterra modeling. The maternal effect is presented as a major driver for this shift in modeling orientation. Investment of mothers in the quality of their daughters makes the rate of reproduction depend not only on the current environment, but also on the environment experienced by the previous generation. Ginzburg is a highly respected ecologist, and this book should be read by most population biologists and ecological modellers and by theoretical biologists and philosophers of science.
BibTeX:
@book{ginzburg-ecological-2004,
  author = {Ginzburg, Lev R. and Colyvan, Mark and NetLibrary, Inc},
  title = {Ecological orbits: how planets move and populations grow},
  publisher = {Oxford University Press},
  year = {2004}
}
Giddings, S. and Shi, Y. Effective field theory models for nonviolent information transfer from black holes 2014 Physical Review D
Vol. 89(12) 
article  
Abstract: Transfer of quantum information from the interior of a black hole to its atmosphere is described, in models based on effective field theory. This description illustrates that such transfer need not be violent to the semiclassical geometry or to infalling observers, and in particular can avoid producing a singular horizon or "firewall". One can specifically quantify the rate of information transfer and show that a rate necessary to unitarize black hole evaporation produces a relatively mild modification to the stress tensor near the horizon. In an exterior description of the transfer, the new interactions responsible for it are approximated by "effective sources" acting on fields in the black hole atmosphere. If the necessary interactions couple to general modes in the black hole atmosphere, one also finds a straightforward mechanism for information transfer rates to increase when a black hole is mined, avoiding paradoxical behavior. Correspondence limits are discussed, in the presence of such new interactions, for both small black holes and large ones; the near-horizon description of the latter is approximately that of Rindler space.
BibTeX:
@article{giddings-effective-2014,
  author = {Giddings, S. B. and Shi, Y. B.},
  title = {Effective field theory models for nonviolent information transfer from black holes},
  journal = {Physical Review D},
  year = {2014},
  volume = {89},
  number = {12}
}
Lund, A.P. Efficient quantum computing with weak measurements 2011 New Journal of Physics
Vol. 13(5), pp. 053024 
article URL 
Abstract: Projective measurements with high quantum efficiency are often assumed to be required for efficient circuit-based quantum computing. We argue that this is not the case and show that the fact that they are not required was actually known previously but was not deeply explored. We examine this issue by giving an example of how to perform the quantum order-finding algorithm efficiently using non-local weak measurements, considering that the measurements used are of bounded weakness and some fixed but arbitrary probability of success less than unity is required. We also show that it is possible to perform the same computation with only local weak measurements, but this must necessarily introduce an exponential overhead.
BibTeX:
@article{lund-efficient-2011,
  author = {Lund, A. P.},
  title = {Efficient quantum computing with weak measurements},
  journal = {New Journal of Physics},
  year = {2011},
  volume = {13},
  number = {5},
  pages = {053024},
  url = {http://stacks.iop.org/1367-2630/13/i=5/a=053024}
}
Cossar, J.D. and Arrowsmith, C.H. Efficient Strategies for Production of Eukaryotic Proteins 2012 Comprehensive Biophysics, pp. 4-33  incollection URL 
Abstract: Cumulative experience based on structural genomics of thousands of eukaryotic proteins enables a rational and informed approach to be taken when setting up a new program for any level of throughput. This chapter presents strategies for the production of proteins for structural or biochemical functional studies. Emphasis is placed on delivery of high-purity, stable proteins in a format suitable for crystallization or nuclear magnetic resonance spectroscopy and having defined biochemical function. A critical component relates to design and selection of a polypeptide with the minimal sequence determinants for a particular biochemical activity, in order to facilitate structural analysis, while optimizing protein expression and solubility. The advantages and disadvantages for various production platforms are discussed, allowing for the best approach to be selected for any particular facility. Potential rescue strategies and quality assurance systems are described.
BibTeX:
@incollection{cossar-efficient-2012,
  author = {Cossar, J. D. and Arrowsmith, C. H.},
  title = {Efficient Strategies for Production of Eukaryotic Proteins},
  booktitle = {Comprehensive Biophysics},
  publisher = {Elsevier},
  year = {2012},
  pages = {4--33},
  url = {http://www.sciencedirect.com/science/article/pii/B9780123749208001041}
}
Kläy, M.P. Einstein-Podolsky-Rosen experiments: the structure of the probability space. I 1988 Foundations of Physics Letters
Vol. 1(3), pp. 205-244 
article DOI URL 
Abstract: Incompatibility of measurements, central to quantum mechanics, is captured in the formalism of empirical logic, which is based on a generalization of the notion of a sample space in Kolmogoroff's axiomatic theory of probability. In composite empirical systems of the kind considered in the Einstein-Podolsky-Rosen Gedankenexperiment, incompatibility gives rise to the notion of influence, which is closely related to stochastic independence.
BibTeX:
@article{klay-einstein-podolsky-rosen-1988,
  author = {Kläy, Matthias P.},
  title = {Einstein-Podolsky-Rosen experiments: the structure of the probability space. I},
  journal = {Foundations of Physics Letters},
  year = {1988},
  volume = {1},
  number = {3},
  pages = {205--244},
  url = {http://dx.doi.org/10.1007/BF00690066},
  doi = {10.1007/BF00690066}
}
Bender, C.J. Electron Magnetic Resonance 2012 Comprehensive Biophysics, pp. 425-493  incollection URL 
Abstract: An introduction to electron magnetic resonance (EMR) with applications in biophysical studies is presented at the level of a nonspecialist or beginning graduate student. The first half of the chapter briefly introduces the resonance phenomenon, a typical EMR spectrum and its interpretation, and describes fundamental applications of electron resonance spectroscopy in free radical research, identification and characterization of metalloproteins and reaction intermediates, spin probes, and imaging. The second half of the chapter describes the magnetochemical origins of resonance spectroscopy and the steps that have led to modern EMR techniques.
BibTeX:
@incollection{bender-electron-2012,
  author = {Bender, C. J.},
  title = {Electron Magnetic Resonance},
  booktitle = {Comprehensive Biophysics},
  publisher = {Elsevier},
  year = {2012},
  pages = {425--493},
  url = {http://www.sciencedirect.com/science/article/pii/B9780123749208001247}
}
Blazsó, T. Elementary information in physics 2011 Physics Essays
Vol. 25(1), pp. 83-90 
article  
Abstract: The paper is an attempt to discover the roots of information in physics. The usual, entropy-based information is a special case, which is closely related to the quantum physical bound state. On the other hand, free particles produce increasing information. A short review of the present theories on the relation of the physical quantities to information and possible future tasks are also included. The present paper will not investigate fully the relationship between physical and communication information but outlines some similarities and differences. [DOI: 10.4006/0836-1398-25.1.83]
BibTeX:
@article{blazso-elementary-2011,
  author = {Blazsó, Tibor},
  title = {Elementary information in physics},
  journal = {Physics Essays},
  year = {2011},
  volume = {25},
  number = {1},
  pages = {83--90}
}
Cover, T.M. and Thomas, J.A. Elements of information theory 2006   book  
BibTeX:
@book{cover-elements-2006,
  author = {Cover, T. M. and Thomas, Joy A.},
  title = {Elements of information theory},
  publisher = {Wiley-Interscience},
  year = {2006},
  edition = {2nd}
}
Wimsatt, W.C. Emergence as Non-Aggregativity and the Biases of Reductionisms 2000 Foundations of Science
Vol. 5(3), pp. 269-297 
article  
Abstract: Most philosophical accounts of emergence are incompatible with reduction. Most scientists regard a system property as emergent relative to properties of its parts if it depends upon their mode of organization-a view consistent with reduction. Emergence is a failure of aggregativity, in which “the whole is nothing more than the sum of its parts”. Aggregativity requires four conditions, giving powerful tools for analyzing modes of organization. Differently met for different decompositions of the system, and in different degrees, the structural conditions can provide evaluation criteria for choosing decompositions, “natural kinds”, and detecting functional localization fallacies, approximations, and various biases of vulgar reductionisms. This analysis of emergence and use of these conditions as heuristics is consistent with a broader reductionistic methodology.
BibTeX:
@article{wimsatt-emergence-2000,
  author = {Wimsatt, William C.},
  title = {Emergence as Non-Aggregativity and the Biases of Reductionisms},
  journal = {Foundations of Science},
  year = {2000},
  volume = {5},
  number = {3},
  pages = {269--297}
}
Bedau, M. and Humphreys, P. Emergence: contemporary readings in philosophy and science 2008   book  
BibTeX:
@book{bedau-emergence:-2008,
  author = {Bedau, Mark and Humphreys, Paul},
  title = {Emergence: contemporary readings in philosophy and science},
  publisher = {MIT Press},
  year = {2008}
}
Shamir, M. Emerging principles of population coding: in search for the neural code 2014 Current Opinion in Neurobiology
Vol. 25, pp. 140-148 
article DOI URL 
Abstract: Population coding theory aims to provide quantitative tests for hypotheses concerning the neural code. Over the last two decades theory has focused on analyzing the ways in which various parameters that characterize neuronal responses to external stimuli affect the information content of these responses. This article reviews and provides an intuitive explanation for the major effects of noise correlations and neuronal heterogeneity, and discusses their implications for our ability to investigate the neural code. It is argued that to test neural code hypotheses further, additional constraints are required, including relating trial-to-trial variation in neuronal population responses to behavioral decisions and specifying how information is decoded by downstream networks.
BibTeX:
@article{shamir-emerging-2014,
  author = {Shamir, Maoz},
  title = {Emerging principles of population coding: in search for the neural code},
  journal = {Current Opinion in Neurobiology},
  year = {2014},
  volume = {25},
  pages = {140--148},
  url = {http://www.sciencedirect.com/science/article/pii/S0959438814000105},
  doi = {10.1016/j.conb.2014.01.002}
}
Carnap, R. Empiricism, Semantics, and Ontology 1950 Revue Internationale de Philosophie
Vol. 4(11), pp. 20-40 
article  
BibTeX:
@article{carnap-empiricism-1950,
  author = {Carnap, Rudolf},
  title = {Empiricism, Semantics, and Ontology},
  journal = {Revue Internationale de Philosophie},
  year = {1950},
  volume = {4},
  number = {11},
  pages = {20--40}
}
Carnap, R. Empiricism, semantics, and ontology 2011, pp. 249-264  incollection  
BibTeX:
@incollection{carnap-empiricism-2011,
  author = {Carnap, Rudolf},
  title = {Empiricism, semantics, and ontology},
  year = {2011},
  pages = {249--264}
}
Saravani, M., Afshordi, N. and Mann, R.B. Empty black holes, firewalls, and the origin of Bekenstein-Hawking entropy 2014 International Journal of Modern Physics D
Vol. 23(13) 
article  
BibTeX:
@article{saravani-empty-2014,
  author = {Saravani, Mehdi and Afshordi, Niayesh and Mann, Robert B.},
  title = {Empty black holes, firewalls, and the origin of Bekenstein-Hawking entropy},
  journal = {International Journal of Modern Physics D},
  year = {2014},
  volume = {23},
  number = {13}
}
McIrvine, E.C. and Tribus, M. Energy and Information 1971 Scientific American
Vol. 225(3), pp. 179-188 
article  
BibTeX:
@article{mcirvine-energy-1971,
  author = {McIrvine, Edward C. and Tribus, Myron},
  title = {Energy and Information},
  journal = {Scientific American},
  year = {1971},
  volume = {225},
  number = {3},
  pages = {179--188}
}
Lam, V. Entities Without Intrinsic Physical Identity 2014 Erkenntnis
Vol. 79(5), pp. 1157-1171 
article  
Abstract: This paper critically discusses recent objections that have been raised against the contextual understanding of fundamental physical objects advocated by non-eliminative ontic structural realism. One of these recent objections claims that such a purely relational understanding of objects cannot account for there being a determinate number of them. A more general objection concerns a well-known circularity threat: relations presuppose the objects they relate and so cannot account for them. A similar circularity objection has also been raised within the framework of the weak discernibility claims made in the last few years about quantum particles. We argue that these objections rely either on mere metaphysical prejudice or on confusing the logico-mathematical formalism within which a physical theory is formulated with the physical theory itself. Furthermore, we defend the motivations for taking numerical diversity as a primitive fact in this context.
BibTeX:
@article{lam-entities-2014,
  author = {Lam, Vincent},
  title = {Entities Without Intrinsic Physical Identity},
  journal = {Erkenntnis},
  year = {2014},
  volume = {79},
  number = {5},
  pages = {1157--1171}
}
Gray, R.M. Entropy 2011 Entropy and Information Theory, pp. 61-95  incollection  
BibTeX:
@incollection{gray-entropy-2011-1,
  author = {Gray, Robert M.},
  title = {Entropy},
  booktitle = {Entropy and Information Theory},
  publisher = {Springer US},
  year = {2011},
  pages = {61--95}
}
Barnum, H., Barrett, J., Clark, L.O., Leifer, M., Spekkens, R., Nicholas Stepanik, Wilce, A. and Wilke, R. Entropy and information causality in general probabilistic theories 2012 New Journal of Physics
Vol. 14(12), pp. 129401 
article URL 
Abstract: In this addendum to our paper (2010 New J. Phys. 12 033024), we point out that an elementary consequence of the strong subadditivity inequality allows us to strengthen one of the main conclusions of that paper.
BibTeX:
@article{barnum-entropy-2012,
  author = {Barnum, Howard and Barrett, Jonathan and Clark, Lisa Orloff and Leifer, Matthew and Spekkens, Robert and Nicholas Stepanik and Wilce, Alex and Wilke, Robin},
  title = {Entropy and information causality in general probabilistic theories},
  journal = {New Journal of Physics},
  year = {2012},
  volume = {14},
  number = {12},
  pages = {129401},
  url = {http://stacks.iop.org/1367-2630/14/i=12/a=129401}
}
Gray, R.M. Entropy and information theory 2011   book  
BibTeX:
@book{gray-entropy-2011,
  author = {Gray, Robert M.},
  title = {Entropy and information theory},
  publisher = {Springer Science & Business Media},
  year = {2011}
}
Short, A.J. and Wehner, S. Entropy in general physical theories 2010 New Journal of Physics
Vol. 12(3), pp. 033023 
article URL 
Abstract: Information plays an important role in our understanding of the physical world. Hence we propose an entropic measure of information for any physical theory that admits systems, states and measurements. In the quantum and classical worlds, our measure reduces to the von Neumann and Shannon entropies, respectively. It can even be used in a quantum or classical setting where we are only allowed to perform a limited set of operations. In a world that admits superstrong correlations in the form of non-local boxes, our measure can be used to analyze protocols such as superstrong random access encodings and the violation of 'information causality'. However, we also show that in such a world no entropic measure can exhibit all the properties we commonly accept in a quantum setting. For example, there exists no 'reasonable' measure of conditional entropy that is subadditive. Finally, we prove a coding theorem for some theories that is analogous to the quantum and classical settings, providing us with an appealing operational interpretation.
BibTeX:
@article{short-entropy-2010,
  author = {Short, Anthony J. and Wehner, Stephanie},
  title = {Entropy in general physical theories},
  journal = {New Journal of Physics},
  year = {2010},
  volume = {12},
  number = {3},
  pages = {033023},
  url = {http://stacks.iop.org/1367-2630/12/i=3/a=033023}
}