
Friday 15 May 2020

Jussi Jylkkä of Åbo Akademi University in Finland: A Physicalist Theory of Consciousness.

Dr Bruce Long interviews Professor Jussi Jylkkä of Åbo Akademi University in Finland about his physicalist identity theory of phenomenal consciousness.


Thursday 23 April 2020

The Probabilistic Difference Maker Theory of Information is ontologically and conceptually circular.


(Original Post at IIMX)

The probabilistic difference maker theory of information (PDMT) seeks to solve recondite problems with the veridicality thesis for semantic information (the thesis that information must be alethic, or truth apt, and true) and with related conceptions of information. However, there are problems with PDMT's premises, especially its cognitivist-subjectivist ones. It conflates the transmission of information with information itself, eliminates causation as a necessary basis for transmission, and conflates abstract information with physical information by removing the token source as a necessary condition and making signal content about source types instead. What I am most interested in here, however, is the begging of the question involved in using objective background data, or information, as a necessary basis for defining information.
 
The nature of information transmission is central to theories of the nature of information, semantic information, and information dynamics (considered at various levels of abstraction). In this post I will refer only to information transmission, rather than to the more metaphorical, and less scientifically and semantically constrained, concept of information flow. This concurs with the approach of Andrea Scarantino and Gualtiero Piccinini (2011), and with that of Scarantino (2015). ‘Information flow’ and ‘information transmission’ are not always used synonymously in the philosophy of information, and the two concepts are not equivalent.
 
For example, in many informational logics, such as substructural relevant logics and the positive informational logic for informational (deductive) inference[1], flow (and relational flow operators) is usually defined according to an abstractive model such as Fred Dretske’s transitive xerox principle, according to which, “if A carries [bears/receives] the information that B, and B carries [bears/receives] the information that C, then A carries [bears/receives] the information that C” (Dretske, 1981, p. 1117). It is implied (but usually not stated[2]) that the concept of flow in these logics somehow maps to, is grounded in, or even supervenes (strongly) upon causal transmission. Thus, transmission is still the relevant information-theoretic and mathematical communication-theoretic concept to use in analysing PDMT.
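Schematically, and in a notation of my own choosing rather than Dretske's, write A ⊳ B for 'A carries the information that B'. The xerox principle is then simply the transitivity of the carrying relation:

$$(A \rhd B) \wedge (B \rhd C) \;\Rightarrow\; (A \rhd C)$$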
 
Researchers, including Scarantino (2015), sometimes equivocate on information and information transmission, or at least fail to disambiguate them. However, information is not information transmission. In a source-to-destination, causal-channel-based system, information is generated at the source and is transmitted in an encoded signal to the receiver and destination. Transmission is a sufficient, but not a necessary, condition for the obtaining of information: any causally based transmission of a pattern generated at a source structure necessarily involves information, but information can obtain without being transmitted. An information source, or a dynamical physical process or structure, is both a sufficient and a necessary condition for the obtaining of information. (If such existential conditions are sought, I suggest that the most important one is causal structure, which then points to a familiar structural realism debate.)
 
 
 
 
Concepts of information transmission are important to Dretske’s epistemically motivated veridicality thesis for semantic information, according to which semantic information must have the property of being true, or must somehow encapsulate truth (the latter, encapsulation-based conception being due to Floridi). Scarantino and Piccinini (2011) specifically oppose this thesis (which I also reject), using their probabilistic conception of information and information transmission to do so (Floridi 2011, 91, 116–17, 245–46; Fresco and Michael 2016, 134; Adams 2003, 476; Dretske 1981b, 45).
 
According to PDMT, there are only two necessary conditions for the obtaining of information (transmission, in fact, which is conflated with information in their theory):
1. The structure of the token signal (regarded as evidence) (Scarantino 2015, 419, 422–23).
2. Mind-dependent, mind-interpreted (Bayesian or otherwise) background data, grounded in agent-acquired knowledge of a type-level signal-to-source correspondence.
Condition 1 is regarded as sufficient for the probabilistic inferential implication of the existence of a token source of origin of the requisite type or kind (regarded as a Bayesian hypothesis). However, neither the existence of such a token source, nor a state thereof, is a necessary condition for the obtaining of information transmission according to PDMT.
 
Our best applied scientific theories of information say there can be no information transmission without causally sustained signals in physical channels causally connected to token sources of origin. The distinction between Shannon information and physical causation is well understood, but its exact basis is not broadly ratified, and there are significant problems with it with respect to individual signal content, semantic information, and the transmission of true messages (Dretske, 1983). According to the mathematical theory of communication, transmission necessarily involves physical causal pathways. Similarly, Shannon’s definition of the obtaining of real information has the necessary existential condition of, or is existentially dependent upon, a physical stochastic process (Cover & Thomas, 2006; Shannon, 1948). One can get information from the lack of a signal, but this one exception also requires the presence of a physical channel with stable channel conditions (invariant physical properties constrained on a natural nomic or lawful basis) and a token source to generate the signal.
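To make the classical picture concrete, here is a minimal sketch (with illustrative numbers of my own, not drawn from any of the cited works) of the standard Shannon quantities: the information a received signal carries about a source is fixed jointly by the source distribution and the channel's conditional probabilities, which the applied theory takes to be underwritten by the physical, causal properties of the channel.

```python
import math

def entropy(dist):
    """Shannon entropy H(X) in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Toy source and channel (illustrative numbers only).
# p_source: distribution over token source states X.
# channel:  P(Y | X), the conditional probabilities that physically
#           connect source states to received signal states.
p_source = {'x0': 0.5, 'x1': 0.5}
channel = {'x0': {'y0': 0.9, 'y1': 0.1},
           'x1': {'y0': 0.2, 'y1': 0.8}}

# Joint and marginal distributions induced by source + channel.
p_joint = {(x, y): p_source[x] * channel[x][y]
           for x in p_source for y in channel[x]}
p_signal = {}
for (x, y), p in p_joint.items():
    p_signal[y] = p_signal.get(y, 0.0) + p

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y): the information the
# received signal carries about the source. It is undefined without both
# a source distribution and a channel connecting source to destination.
h_x = entropy(p_source)
h_y = entropy(p_signal)
h_xy = -sum(p * math.log2(p) for p in p_joint.values() if p > 0)
print(f"I(X;Y) = {h_x + h_y - h_xy:.3f} bits")
```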
 
There are two main problems with PDMT:
 
1. Its elimination of temporal, causal, and directional in re (in a channel and source coupling) signal structure from the concept of transmission, in connection with its replacement of frequentist statistics and source-signal causal pathways with Bayesian inference and confirmation; and
2. Its inclusion of mind dependence as a necessary condition of the obtaining of transmission.
 
The first of these is the core thesis of PDMT:
[T]he transmission of natural information entails nothing more than the truth of a probabilistic claim…signals carry natural information by changing the probability of what they are about…On this view, spots carry natural information about measles not because all and only patients with spots have measles but because patients with spots are more likely to have measles than patients without spots. (Scarantino and Piccinini 2011, 70; Martinez and Sequoiah-Grayson 2019, sec. 2.2-3.1, 29)
The ‘nothing more’ claim has been multiply opposed by philosophers of information (Lloyd, 1989, p. 64; Lombardi, 2004, pp. 113–117; Millikan, 2013, pp. 141–142; Shea, 2007, p. 421; Stegmann, 2015, pp. 874–877). Scarantino is interested in this premise and principle because he opposes Fred Dretske’s veridicality thesis: that information requires truth (a thesis upheld by Luciano Floridi). Scarantino wants information to have a probabilistic basis, and not to require veridicality. This is more in keeping with Shannonian classical information theory, and it avoids some of the pitfalls of the Dretskian approach (the requirement of p = 1, or statistical certainty about the state of the source, for the signal and message received from a channel). I happen to agree with the rejection of the veridicality thesis, because I regard information as a truthmaker rather than a truth bearer. However, it does not follow from the truth of the probabilistic, probability-raising characterisation of transmission that a token, causally connected source is not required for information transmission.
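To make the probability-raising claim concrete, here is a small worked example with hypothetical numbers (mine, not Scarantino and Piccinini's):

```python
# Illustrative (made-up) numbers for the spots/measles example.
p_measles = 0.01              # prior P(measles)
p_spots_given_measles = 0.9   # P(spots | measles)
p_spots_given_healthy = 0.05  # P(spots | no measles)

# Total probability of spots.
p_spots = (p_spots_given_measles * p_measles
           + p_spots_given_healthy * (1 - p_measles))

# Bayes' rule: posterior probability of measles given spots.
p_measles_given_spots = p_spots_given_measles * p_measles / p_spots

# On PDMT, spots carry natural information about measles just in case
# they raise its probability: P(measles | spots) > P(measles).
print(p_measles_given_spots)              # ~0.154
print(p_measles_given_spots > p_measles)  # True: the probability is raised
```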
 
PDMT’s primary revision is thus the replacement of the combined statistical and causal grounds of transmission with Bayesian probabilistic inference alone. It regards the signal structure as evidence in a Bayesian statistical framework. The (state of the) source is mapped to a Bayesian hypothesis, such that the probability of the source’s being in a given state, conditional upon the evidence at the signal and given mind-dependent, psychologically interpreted propositional background data, is the information content. This constitutes information according to PDMT. However, it is really a (revisionary and flawed) conception of transmission, and, as already noted, transmission is not identical to information, since it is information that is transmitted.
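A minimal sketch of that Bayesian framing, as I reconstruct it (hypothetical numbers and variable names, not Scarantino's own formalism): the token signal is the evidence, source types are the hypotheses, and the background data supply the prior and the likelihoods.

```python
# Hypothetical reconstruction of PDMT-style inference from a token
# signal to a *type* of source, using Bayes' rule.

# Background data: a prior over source types and likelihoods of the
# observed signal under each type. These are themselves information
# about signal-to-source-type correspondences.
priors = {'source_type_A': 0.7, 'source_type_B': 0.3}
likelihoods = {'source_type_A': 0.2, 'source_type_B': 0.6}  # P(signal | type)

# Posterior over source types given the token signal (the evidence).
evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

print(posteriors)  # e.g. {'source_type_A': ~0.44, 'source_type_B': ~0.56}
# No token source causally coupled to the signal appears anywhere above;
# only the background distributions over source *types* do the work, and
# those background distributions are themselves information.
```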
 
An immediate response to this observation might be that information just is the transmission of a signal that results in an objective reduction in uncertainty at the destination about the source. However, Shannon’s theory also treats information as being generated at a source, and if that is taken seriously in ontological terms, then information is not a reduction in uncertainty alone (or perhaps at all), as probabilism about the nature of information would have it. Moreover, the reduction-in-uncertainty conception would preclude information from being encapsulated by structures in algorithmic information theory, and that is too quick, at best.
 
PDMT is multiply revisionary with respect to the classical conception of transmission, but it is its subjectivist and cognitivist revisions, which make the obtaining of information dependent upon propositional, mind-dependent background data, that are the most troublesome. Mind-dependence prevents PDMT information[3] from being an objective commodity, as Scarantino claims it is, because the mind dependence is ineliminable from the establishment of, and agent access to, the background information or data.

It is a straightforward problem of petitio principii, or begging the question. Background data are information. They must be. You cannot use reference to meaning as part of the basis (or part of the premises) for a definition of meaning, reference to knowledge as part of the basis for defining knowledge, or reference to belief to define belief, and so on. Likewise, you cannot use information, or reference to information, as part of the basis for a definition of information. PDMT does this. Objective (Bayesian or otherwise) background data constitute part of the basis for the propositional mental content that is a necessary condition for the obtaining of information in a signal. These background data are required to track and refer to information source types, since information source tokens are not required for information to obtain according to PDMT.
 
Some might allege that this claim is due to a failure to understand or account for the nature of Bayesian (or else non-Bayesian) background data. However, as salutary and relevant as interpretations of probability, and the associated disagreements about how Bayesian probability works (subjective versus objective interpretations), may be, they have little bearing on this question. Whatever background data are, they are information, even if they are objectively determined. In any case, according to PDMT they cannot be objectively determined at all, since information is necessarily mind-dependent. The problems caused by the circularity are even more apparent in these terms.
 
 
 Notes
 
[1] (Barwise et al., 1995; D’Alfonso, 2014; Restall, 1996, pp. 466–467; Sequoiah-Grayson, 2009)
[2] Note that only the term ‘flow’, but never the term ‘transmission’, is used in (Aucher, 2014; Restall, 1996, 2018).
[3] Or transmission, since PDMT seems to equivocate on information and information transmission.

Bibliography

Aucher, G. (2014). Dynamic Epistemic Logic as a Substructural Logic. In A. Baltag & S. Smets (Eds.), Johan van Benthem on Logic and Information Dynamics (pp. 855–880). Springer International Publishing. https://doi.org/10.1007/978-3-319-06025-5_33
Barwise, J., Gabbay, D., & Hartonas, C. (1995). On the Logic of Information Flow. Logic Journal of IGPL, 3(1), 7–49. https://doi.org/10.1093/jigpal/3.1.7
Cover, T. M., & Thomas, J. A. (2006). Elements of information theory (2nd ed.). Wiley-Interscience.
D’Alfonso, S. (2014). The Logic of Knowledge and the Flow of Information. Minds and Machines, 24(3), 307–325.
Dretske, F. (1981). Knowledge and the flow of information (eBook: Kindle). Blackwell. https://www.amazon.com.au/Knowledge-Flow-Information-David-Hume-ebook/dp/B00IL4MAIM/ref=sr_1_1_twi_kin_1?ie=UTF8&qid=1552637463&sr=8-1&keywords=Knowledge+and+the+Flow+of+Information
Lloyd, D. E. (1989). Simple minds. MIT Press.
Lombardi, O. (2004). What is Information? Foundations of Science, 9(2), 105–134.
Millikan, R. G. (2013). Natural information, intentional signs and animal communication. In U. E. Stegmann (Ed.), Animal Communication Theory: Information and Influence (pp. 133–148). Cambridge University Press; Cambridge Core. https://doi.org/10.1017/CBO9781139003551.008
Restall, G. (1996). Information Flow and Relevant Logics. In J. Seligman & D. Westerståhl (Eds.), Logic, Language and Computation (pp. 463–477). CSLI Publications, Stanford.
Restall, G. (2018). Substructural Logics. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2018). Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/spr2018/entries/logic-substructural/
Scarantino, A. (2015). Information as a Probabilistic Difference Maker. Australasian Journal of Philosophy, 93(3), 1–25. https://doi.org/10.1080/00048402.2014.993665
Scarantino, A., & Piccinini, G. (2011). Information without Truth. In Putting Information First (pp. 66–83). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781444396836.ch5
Sequoiah-Grayson, S. (2009). A Positive Information Logic for Inferential Information. Synthese, 167(2), 409–431.
Shannon, C. E. (1948). A Mathematical Theory of Communication. The Bell System Technical Journal, 27, 379–423, 623–656. (Reprinted with corrections, 1998.)
Shea, N. (2007). Consumers Need Information: Supplementing Teleosemantics with an Input Condition. Philosophy and Phenomenological Research, 75(2), 404–435.
Stegmann, U. (2015). Prospects for Probabilistic Theories of Natural Information. Erkenntnis, 80(4), 869–893. https://doi.org/10.1007/s10670-014-9679-9

Tuesday 21 April 2020

Pluralism versus unification about the nature of information

(Original post at IIMX)

The nature of information, and the formulation of unificatory theories of the nature of information and of semantic information (when the two are regarded as separate[1]), are still grounded in a set of open metaphysical, logical, metaphilosophical, and methodological questions. Information is a polysemous concept across domains of scientific and non-scientific application, and across contexts within those domains. Thus, most contemporary theorists in the field recognize pluralism about the nature of information as a correct metametaphysical assumption and premise for the philosophy of information (T. W. Deacon, 2008, p. 150). However, many of these theorists also present unificatory conceptions, including what Luciano Floridi has called an ur-concept of information. These conceptions are often surprisingly reductionist.

Developing any unifying conception of information involves dealing effectively with foundational problems in the metaphysics of information. For example, I have argued elsewhere that a basic but important error committed by philosophers of information is that, in discussing information transmission and the nature of information, they often confuse and conflate the two. A similar problem arises for theories of the nature of information that focus on information content. Information and information content are not the same thing.
One needs a conception of the nature of information before one can present a conception of information transmission, and defining information in terms of, or as, the transmission of information is clearly circular. This is often an indirect error, but it is sometimes due to carelessness. The carelessness often comes with an ambitious attempt to present a grand-schematic metametaphysics for information, or else to secure a central naturalistic conception on a probabilistic or other basis (see Stegmann and Scarantino). Likewise for defining information in terms of information content: one needs a conception of the nature of information before one defines the nature of information content, unless one is just referring to the information in some structure, source, process, system, mechanism, or other phenomenon.

It is not impossible that a central, epistemically and metaphysically satisfying conception of the nature of information might come to be broadly ratified. However, if so, it is likely to be tempered both by theory defeasibility and by pluralism across different explanatory levels and in different metaphysical, scientific, and explanatory contexts. Theory defeasibility is especially likely where and when conceptions of information are influenced by or grounded in scientific conceptions of information and information transmission, which is common. This is precisely because of the defeasibility of scientific theories in general, albeit on an optimistic meta-inductive basis. Pluralism is inherent at a metametaphysical level, since more than one influential scientific theory is understood to provide a strong basis for a conception of information. Correspondingly, the main scientific characterisations of information (probabilistic-entropic and computational) have very different ontic and conceptual bases, and various efforts to unify them have been attempted.
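To illustrate the contrast with a toy sketch (using compression length only as a crude, computable stand-in for Kolmogorov complexity): the probabilistic-entropic characterisation attaches information to a distribution over outcomes, while the computational characterisation attaches it to an individual string or structure.

```python
import math
import os
import zlib

def shannon_entropy(probabilities):
    """Entropy in bits of a distribution over outcomes (the probabilistic-entropic notion)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def compressed_length(s: bytes) -> int:
    """Compressed length in bytes: a rough, computable stand-in for the
    (uncomputable) Kolmogorov complexity of an individual string."""
    return len(zlib.compress(s))

# Probabilistic-entropic: information is a property of a source distribution.
print(shannon_entropy([0.5, 0.5]))           # 1.0 bit per symbol for a fair coin

# Computational: information is a property of an individual string.
print(compressed_length(b"ab" * 500))        # highly regular, compresses to very little
print(compressed_length(os.urandom(1000)))   # incompressible, stays near 1000 bytes
```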

Luciano Floridi is probably the best-known philosopher of information to present both a framework for the accommodation of pluralism in different explanatory settings and a (Kantian transcendental) unificatory conception of information that provides a basis for a general conception of information across numerous domains and explanatory levels (Floridi, 2008, 2011b). His work is notable for this system, based as it is upon the concept of levels of abstraction, and also for breaking with approaches recommended by earlier theorists, some of whom pursued multifaceted models accounting for, say, cognitive and non-cognitive situations and epistemic and non-epistemic content in order to provide a unified conception of information across multiple metaphysical domains. It is notable, however, that Floridi's Kantian ur-concept of information and informational structuralism is coupled with a sophisticated application of the computer-scientific idea of levels of abstraction (LoA) for complex systems architectures. The ur-conceptual Kantian transcendental differentiae de re that constitute Floridi's somewhat reductionist basis for information and data must reconcile coherently with the pluralist LoA schema. It is by no means a comfortable marriage of concepts, given Floridi's naturalistic but non-reductionist ontic commitments.

In keeping with the openness of the question of the nature of information and semantic information, the impression that there has thus far not been a convincing solution to either, and the awareness of and commitment to pluralism about information as an in-principle metaphilosophical and meta-metaphysical normative guide, a number of theorists have offered multifaceted and often triadic (on different bases) conceptions of the nature of information.

Tripartite approaches are sometimes influenced by Peircian semiotic or other premises, and sometimes by the perceived existence of different natural and other aspects of information: cognitive internal versus external representation, sources, and flow, for example. They are also often married with existing extensive or else grand schema philosophies, which I suggest is another mistake. Hofkirchner's (2009) tripartite Hegelian conception recommends that a unified conception of information has to account for what he calls the syntactic, semantic, and pragmatic features of existing conceptions of information and the theories that deploy them. However, the Hegelian commitments tend to greatly impede the clarity and aptness of the project, as does the normative assumption that information has a tripartite character, motivated as it is by longstanding concerns about how to account for semantic information content.

Terrence Deacon (2007, 2008, 2010, 2013) outlines another grand-schematic marriage of scientific theory, older philosophy, and a triadic model. It is a Peircian triadic-semiotic and Shannonian classical hybrid which normatively stipulates, and emphasizes the need to account for, physical, referential, and normative components of information. The Peircian model forces a-priori conceptual and stipulative premises into Deacon's metaphysical schema where the naturalistic scientific metaphysical (or at least naturalistic and scientistic[2]) approach he otherwise favours would not need them. For example, one wonders why Deacon's systems-theoretic conceptions of regularity, or self-organising regular behavior, or perhaps even just regular dynamics (or even "pattern or cyclicity", or else "merely a tendency to consistently exhibit some possible states more often than others") are not enough to fulfil the functional and explanatory role he instead ascribes to the Peircian notion of habit.

Peircian habit is applied by Deacon as a means to imply naturalistic teleology, on a teleonomic basis, for what he calls teleodynamic systems. It also supports a counterfactual notion of causal power or influence for what is absent from the state of a system or signal, which in turn is again associated with a subjectivist teleology for systems. The accompanying subjectivism both requires and accommodates the necessary inclusion of a recursively autopoietic and evolved Peircian interpretant and interpreter. Such a convolution, though inventive, is simply not necessary to explain natural information content. Subjective agents, interpreters, and consumers are not necessary conditions for natural information content in physical systems and their structures. There is no objective, nomic, or naturally necessary reason to accept normative stipulative assertions to the contrary.

Naturalistic attempts to reconcile unificatory conceptions of information with pluralist premises face numerous challenges. Attempting to import or apply existing and dated grand-schematic, a-priorist metaphysical and semiotic schemas creates problems both for the coherence of the resulting theories and for the reputation of the philosophy of information. A scientific metaphysical approach that heavily de-emphasises a-priorist premises is preferable.
Notes


[1] According to some conceptions, natural and physical information is intrinsically semantic by way of causal indication (Long, 2014).
[2] I suppose my inclusion of the term ‘scientistic’, although not with negative connotations, is intended to signify more a-priori conceptual analytic approaches.


Readings

Adriaans, P. (2010). A Critical Analysis of Floridi’s Theory of Semantic Information. Knowledge, Technology & Policy. https://doi.org/10.1007/s12130-010-9097-5

Deacon, T. W. (2007). Shannon – Boltzmann – Darwin: Redefining information (Part I). Cognitive Semiotics. https://doi.org/10.3726/81600_123

Deacon, T. W. (2008). Shannon - Boltzmann - Darwin: Redefining information (Part II). Cognitive Semiotics. https://doi.org/10.3726/81605_169

Deacon, T. W. (2010). What is missing from theories of information? In P. Davies & N. H. Gregersen (Eds.), Information and the Nature of Reality: From Physics to Metaphysics (pp. 146–169). Cambridge University Press. https://doi.org/10.1017/CBO9780511778759.008

Deacon, T. W. (2013). Incomplete nature: How mind emerged from matter (1st ed.). New York: W. W. Norton & Co.

Dennett, D. C. (2013). Aching Voids and Making Voids: A review of Incomplete Nature: How Mind Emerged from Matter, by Terrence W. Deacon. The Quarterly Review of Biology. https://doi.org/10.1086/673760

Dodig Crnkovic, G., & Hofkirchner, W. (2011). Floridi’s “Open Problems in Philosophy of Information”, Ten Years Later. Information. https://doi.org/10.3390/info2020327

Floridi, L. (2008). A defence of informational structural realism. Synthese. https://doi.org/10.1007/s11229-007-9163-z

Fodor, J. (2012, May 24). What are trees about? London Review of Books, 1–5. https://www.lrb.co.uk/the-paper/v34/n10/jerry-fodor/what-are-trees-about

Hofkirchner, W. (2009). How to achieve a unified theory of information. TripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society. https://doi.org/10.31269/vol7iss2pp357-368

Scarantino, A. (2015). Information as a Probabilistic Difference Maker. Australasian Journal of Philosophy, 93(3), 1–25. https://doi.org/10.1080/00048402.2014.993665

Stegmann, U. (2015). Prospects for Probabilistic Theories of Natural Information. Erkenntnis, 80(4), 869–893. https://doi.org/10.1007/s10670-014-9679-9
