Information, 2013

Complexity over Uncertainty in Generalized Representational Information Theory (GRIT): A Structure-Sensitive General Theory of Information

DOI: 10.3390/info4010001

Keywords: information theory, representational information, categorization, concepts, invariance, complexity, information measure, subjective information


Abstract:

What is information? Although researchers have used the construct of information liberally to refer to pertinent forms of domain-specific knowledge, relatively few have attempted to generalize and standardize the construct. Shannon and Weaver (1949) offered the best-known attempt at a quantitative generalization in terms of the number of discriminable symbols required to communicate the state of an uncertain event. This idea, although useful, does not capture the role that structural context and complexity play in the process of understanding an event as being informative. In what follows, we discuss the limitations and futility of any generalization (and particularly Shannon’s) that is not based on the way that agents extract patterns from their environment. More specifically, we shall argue that agent concept acquisition, and not the communication of states of uncertainty, lies at the heart of generalized information, and that the best way of characterizing information is via the relative gain or loss in concept complexity that is experienced when a set of known entities (regardless of their nature or domain of origin) changes. We show that Representational Information Theory (RIT) perfectly captures this crucial aspect of information and conclude with the first generalization of RIT to continuous domains.
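The abstract's core quantity — the relative change in concept complexity when a set of known entities changes — can be sketched in code. Following the invariance-pattern idea in Vigo (2009), the sketch below computes, for each dimension of a Boolean category, the proportion of members that remain members when that dimension's value is flipped, and then discounts category size by the norm of this pattern. The function names and the specific form ψ(F) = p·exp(−‖Φ‖²) are illustrative assumptions for this sketch, not the paper's exact parameterized equations (which appear as Equations (10) and (11) in the main text).

```python
import math

def logical_manifold(category, dims):
    """Per-dimension invariance pattern: the fraction of category members that
    remain members when the i-th Boolean feature is flipped."""
    members = set(category)
    p = len(members)
    phi = []
    for i in range(dims):
        stay = sum(
            1 for obj in members
            if tuple(v ^ 1 if j == i else v for j, v in enumerate(obj)) in members
        )
        phi.append(stay / p)
    return phi

def structural_complexity(category, dims):
    """Illustrative complexity measure (assumed form): category size discounted
    by the squared norm of its invariance pattern, psi = p * exp(-||phi||^2)."""
    phi = logical_manifold(category, dims)
    return len(set(category)) * math.exp(-sum(x * x for x in phi))

def representational_information(category, item, dims):
    """Relative change in structural complexity when `item` is removed -- a
    sketch of the abstract's 'gain or loss in concept complexity'."""
    full = structural_complexity(category, dims)
    reduced = structural_complexity([x for x in category if x != item], dims)
    return (full - reduced) / full

# A 2-dimensional, well-defined category: "first feature equals 0".
F = [(0, 0), (0, 1)]
print(logical_manifold(F, 2))              # [0.0, 1.0]: dimension 2 is invariant
print(round(structural_complexity(F, 2), 4))  # 0.7358, i.e. 2 * exp(-1)
print(representational_information(F, (0, 0), 2))
```

The example shows the structure-sensitivity the abstract argues for: the second dimension carries a perfect invariance pattern (flipping it never ejects a member), so the category is less complex than its raw size suggests, and removing a member changes that complexity.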

References

[1]  Devlin, K. Logic and Information; Cambridge University Press: Cambridge, UK, 1991.
[2]  Luce, R.D. Whatever happened to information theory in psychology? Rev. Gen. Psychol. 2003, 7, 183–188, doi:10.1037/1089-2680.7.2.183.
[3]  Floridi, L. The Philosophy of Information; Oxford University Press: Oxford, UK, 2011.
[4]  Devlin, K. Claude Shannon, 1916–2001. Focus News. Math. Assoc. Am. 2001, 21, 20–21.
[5]  Hartley, R.V.L. Transmission of information. Bell Syst. Tech. J. 1928, 7, 535–563.
[6]  Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
[7]  Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1949.
[8]  Bishop, C.M. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006.
[9]  Vigo, R. Representational information: A new general notion and measure of information. Inf. Sci. 2011, 181, 4847–4859, doi:10.1016/j.ins.2011.05.020.
[10]  Klir, G.J. Uncertainty and Information: Foundations of Generalized Information Theory; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2006.
[11]  Miller, G.A. The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychol. Rev. 1956, 63, 81–97, doi:10.1037/h0043158.
[12]  Laming, D.R.J. Information Theory of Choice-Reaction Times; Academic Press: New York, NY, USA, 1968.
[13]  Deweese, M.R.; Meister, M. How to measure the information gained from one symbol. Network 1999, 10, 325–340, doi:10.1088/0954-898X/10/4/303.
[14]  Butts, D.A. How much information is associated with a particular stimulus? Network 2003, 14, 177–187, doi:10.1088/0954-898X/14/2/301.
[15]  Laming, D. Statistical information, uncertainty, and Bayes’ theorem: Some applications in experimental psychology. In Symbolic and Quantitative Approaches to Reasoning with Uncertainty; Benferhat, S., Besnard, P., Eds.; Springer-Verlag: Berlin, Germany, 2001; pp. 635–646.
[16]  Dretske, F. Knowledge and the Flow of Information; MIT Press: Cambridge, MA, USA, 1981.
[17]  Tversky, A.; Kahneman, D. Availability: A heuristic for judging frequency and probability. Cogn. Psychol. 1973, 5, 207–233, doi:10.1016/0010-0285(73)90033-9.
[18]  Tversky, A.; Kahneman, D. Extension versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychol. Rev. 1983, 90, 293–315, doi:10.1037/0033-295X.90.4.293.
[19]  Vigo, R. A dialogue on concepts. Think 2010, 9, 109–120, doi:10.1017/S1477175609990297.
[20]  Vigo, R. Categorical invariance and structural complexity in human concept learning. J. Math. Psychol. 2009, 53, 203–221, doi:10.1016/j.jmp.2009.04.009.
[21]  Vigo, R. Towards a law of invariance in human conceptual behavior. In Proceedings of the 33rd Annual Conference of the Cognitive Science Society, Austin, TX, USA, 21 July 2011; Carlson, L., Hölscher, C., Shipley, T., Eds.; Cognitive Science Society: Austin, TX, USA, 2011.
[22]  Vigo, R. The gist of concepts. Cognition 2012. Submitted for publication.
[23]  Feldman, D.; Crutchfield, J. A Survey of “Complexity Measures”; Santa Fe Institute: Santa Fe, NM, USA, 11 June 1998; pp. 1–15.
[24]  Vigo, R. A note on the complexity of Boolean concepts. J. Math. Psychol. 2006, 50, 501–510, doi:10.1016/j.jmp.2006.05.007.
[25]  Vigo, R. Modal similarity. J. Exp. Artif. Intell. 2009, 21, 181–196, doi:10.1080/09528130802113422.
[26]  Vigo, R.; Basawaraj, B. Will the most informative object stand? Determining the impact of structural context on informativeness judgments. J. Cogn. Psychol. 2012. in press.
[27]  Vigo, R.; Zeigler, D.; Halsey, A. Gaze and information processing during category learning: Evidence for an inverse law. Vis. Cogn. 2012. Submitted for publication.
[28]  Bourne, L.E. Human Conceptual Behavior; Allyn and Bacon: Boston, MA, USA, 1966.
[29]  Estes, W.K. Classification and Cognition; Oxford Psychology Series 22; Oxford University Press: Oxford, UK, 1994.
[30]  Garner, W.R. The Processing of Information and Structure; Wiley: New York, NY, USA, 1974.
[31]  Garner, W.R. Uncertainty and Structure as Psychological Concepts; Wiley: New York, NY, USA, 1962.
[32]  Kruschke, J.K. ALCOVE: An exemplar-based connectionist model of category learning. Psychol. Rev. 1992, 99, 22–44, doi:10.1037/0033-295X.99.1.22.
[33]  Aiken, H.H.; The Staff of the Computation Laboratory at Harvard University. Synthesis of Electronic Computing and Control Circuits; Harvard University Press: Cambridge, MA, USA, 1951.
[34]  Higonnet, R.A.; Grea, R.A. Logical Design of Electrical Circuits; McGraw-Hill: New York, NY, USA, 1958.
[35]  For the reader’s convenience, the parameterized variants of Equations (10) and (11) (see main text), introduced by Vigo (2009, 2011), add the following free parameters. A sensitivity parameter in both expressions stands for a human observer’s degree of sensitivity to (i.e., extent of detection of) the invariance pattern associated with the i-th dimension; it is usually a number in a closed real interval. k is a scaling parameter in a closed real interval (where D is the number of dimensions associated with the category) that indicates the subject’s overall ability to discriminate between dimensions (a larger value indicates higher discrimination), and c is a constant parameter in a closed interval that captures possible biases displayed by observers toward invariant information (c is added to the numerator and the denominator of the ratios that make up the logical or structural manifold of the well-defined category). Finally, s is a parameter that selects the most appropriate distance measure as defined by the generalized Euclidean metric (i.e., the Minkowski distance); in our investigation, the best predictions are achieved when s = 2 (i.e., with the Euclidean metric). Optimal estimates of these free parameters on the aggregate data provide a baseline for assessing individual differences in the pattern-perception stage of the concept-learning process and may provide a basis for more accurate measurements of subjective representational information.
[36]  We could simply define the representational information of a well-defined category as the derivative of its structural complexity. We do not because our characterization of the degree of invariance of a concept function is based on a discrete counterpart to the notion of a derivative in the first place.
[37]  Nosofsky, R.M. Choice, similarity, and the context theory of classification. J. Exp. Psychol. Learn. Mem. Cogn. 1984, 10, 104–114, doi:10.1037/0278-7393.10.1.104.
[38]  Shepard, R.N.; Romney, A.K. Multidimensional Scaling: Theory and Applications in the Behavioral Sciences; Seminar Press: New York, NY, USA, 1972; Volume I.
[39]  Kruskal, J.B.; Wish, M. Multidimensional Scaling; Sage University Paper Series on Quantitative Applications in the Social Sciences 07-011; Sage Publications: Beverly Hills, CA, USA, 1978.
[40]  Shepard, R.N. Towards a universal law of generalization for psychological science. Science 1987, 237, 1317–1323.

