Towards a Semantic Theory of Information
Abstract: Information can be understood as that which reduces uncertainty, whatever its origin. In human communication, information is meaningful only when it is part of a finished or intentional action. Meaning should be approached empirically, from the perspective of language use. If we study how signification is processed through the transmission of ordinary language use, we see that it takes place by communicating a set of prototype categories, the core or central facts, which define meaning as an empirical hypothesis. But if there are central facts exhibiting the use of words, then other, more or less peripheral, facts must also exist, knowledge of which is necessary in order to communicate in contexts far from the "denotative conceptual norm". Hence meaning can be represented by a fuzzy subset of a partition of the universe of discourse. This concept of meaning can be integrated into a formal model of a semantic source, and information can be measured by a non-probabilistic entropy.
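The abstract's closing claim, that a fuzzy-subset representation of meaning admits a non-probabilistic measure of information, can be illustrated with a short sketch. The code below assumes the De Luca-Termini fuzzy entropy (a standard non-probabilistic entropy for fuzzy sets); the abstract does not name a specific measure, so this choice is an assumption, and the membership values are purely illustrative.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini non-probabilistic entropy of a fuzzy subset.

    `memberships` are the degrees mu(x) in [0, 1] with which each element
    of the (partitioned) universe of discourse belongs to the fuzzy subset
    representing a word's meaning. This measure is one assumed candidate
    for the "non-probabilistic entropy" mentioned in the abstract.
    """
    h = 0.0
    for mu in memberships:
        # Sum -p*ln(p) over p = mu and p = 1 - mu, skipping 0 and 1,
        # where the contribution is zero by convention.
        for p in (mu, 1.0 - mu):
            if 0.0 < p < 1.0:
                h -= p * math.log(p)
    return h

# A crisp subset (all memberships 0 or 1) carries zero fuzzy entropy;
# maximal vagueness (all memberships 0.5) maximizes it.
print(fuzzy_entropy([0.0, 1.0, 1.0]))  # crisp meaning: entropy 0.0
print(fuzzy_entropy([0.5, 0.5]))       # maximally fuzzy: 2 * ln(2)
```

On this reading, "central" uses of a word have memberships near 1, peripheral uses have intermediate memberships, and the entropy quantifies how much uncertainty the fuzzy meaning leaves unresolved.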
tripleC is a peer-reviewed, open-access journal (ISSN: 1726-670X). All journal content, except where otherwise noted, is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Austria License.