General Information Theory: Time and Information
This paper introduces time into information theory, gives a more precise definition of information, and unifies the notion of information used in cognition with that of Shannon information theory. Specifically, we treat time as a measure of information, giving definitions of time, of event independence within a time frame, and of conditional probability. We then propose an analysis method based on a unified time measure and derive laws of information-entropy reduction and increase, which indicate that the second law of thermodynamics holds only within a particular time-measure framework. We propose the concepts of negative probability and the information black hole to interpret the conservation of information in physics. With time introduced, we can define natural variation and artificial variation from the perspective of information, and we argue that mutation is a more appropriate model of the neural-network training process. Finally, we point out defects of existing artificial intelligence.
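The abstract builds on Shannon's information entropy, which the paper's entropy-reduction and entropy-increase laws extend. As background, a minimal sketch of the standard Shannon entropy (not the paper's time-dependent measure, which is defined in the full text):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: each toss carries 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))
```

The function name `shannon_entropy` is illustrative; the paper's contribution is to ask how such a measure behaves under different time frameworks, which this classical formula does not capture.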