Chapter 13: A Preliminary Study of Scientific Logic
Falsifiability, the inverse of verifiability, delimits the concept of science by drawing a demarcation boundary. The notion that the network is innately correct even though we cannot grasp its details is in fact a referent, resting on the assumption that there exists a theory that can accurately describe the world. This kind of self-reference is dangerous; it easily produces paradoxes like "this sentence is false." But it is precisely the beauty of this coupled structure that attracts me: the network should have this coupled structure, and the contradiction is the embodiment of its high-dimensional structure. It may even be an atomic structure that is described statistically at a larger scale. Finally, we use the results of our observations to construct the network structure.
Descriptions of the network tend to use the language of probability, and the words "always" and "all" are avoided (itself a small contradiction, but normal in the high-dimensional structure of the mind). The network can only assign a probability to connecting a new piece of information to the existing information. For example, every swan we have observed so far is white. We cannot conclude that all swans are white; on the network's view, Bayesian updating only drives the probability that the next swan we encounter is white toward 100%. This probability is dynamic, and over a sufficiently long stretch of space-time (the law of large numbers), we are ultimately able to infer probabilities from frequencies.
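One way to make the swan example concrete is Laplace's rule of succession, a specific Bayesian model not named in the text and used here only as an illustrative assumption: with a uniform prior, after observing n white swans out of n, the probability that the next swan is white is (n+1)/(n+2), which approaches but never reaches 100%.

```python
def prob_next_white(n_white: int, n_total: int) -> float:
    """Posterior predictive probability that the next swan is white,
    under Laplace's rule of succession (uniform Beta(1, 1) prior)."""
    return (n_white + 1) / (n_total + 2)

# The probability is dynamic: it rises with evidence but never hits 1.
for n in (1, 10, 100, 10_000):
    print(n, prob_next_white(n, n))
```

Note that even after ten thousand white swans the predicted probability stays strictly below 1, matching the claim that the connection is probabilistic rather than absolute.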
The network's logical computing system is not only a syllogistic logic that summarizes old knowledge but also a combinatorial logic that derives new knowledge. Induction and deduction operate at different levels of the network, and both use similarity to establish connections between concepts. The network's grasp of a concept is the probability of a node collapsing to form a path. Induction summarizes the network's basic levels and architecture; deduction is the network's selective expression. Both are wagers on probability: a connection is made because similarity exceeds a certain threshold, and for the same reason both leave room for error.
Induction can be iterated without limit in the ideal realm of mathematics; in reality it meets a certain resistance and eventually converges. This convergence range is the average distance of the network.
Deduction needs an initial pattern from which to construct a picture of the world, and its mode of combination is the network's selective expression. Deduction in fact exploits the network's average distance, like the six degrees of separation, to construct relationships that already existed but had not yet been incorporated into the structure of our thought. This average distance is related to the effective boundary of the function.
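The "average distance" invoked here can be computed directly. A minimal sketch, using an Erdős–Rényi random graph and breadth-first search (the graph model and its parameters are illustrative assumptions, not anything the text specifies):

```python
import random
from collections import deque

def random_graph(n: int, p: float, seed: int = 0) -> list:
    """Erdős–Rényi G(n, p): each pair of nodes is linked with probability p."""
    rng = random.Random(seed)
    adj = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    return adj

def average_distance(adj: list) -> float:
    """Mean shortest-path length over all reachable ordered pairs, via BFS."""
    total, pairs = 0, 0
    for src in range(len(adj)):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(d for d in dist.values() if d > 0)
        pairs += len(dist) - 1
    return total / pairs if pairs else float("inf")

# A few hundred nodes with modest link density already yields a small
# average distance: the "six degrees" effect.
print(average_distance(random_graph(200, 0.05)))
```

Because the average distance of a random graph grows only logarithmically with its size, even very large networks keep short paths between almost any two nodes, which is what lets deduction bridge concepts that were never explicitly connected.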
The basic elements are finite, their combinations are infinite, and yet the meaningful combinatorial sequences or patterns are limited. This is another illustration of "there is nothing new under the sun."
Construct the network's low-dimensional structure, the sequence (which is the basic proposition of logic), and traverse it at different levels to build up the original network structure. Similarity, that is, the degree of matching between sequences, makes the logic we construct the product of a concept. An effective path can be expressed only by constantly adjusting its probability in real use, that is, by letting experience bring the idea closer to reality; once a concept's structure has adapted to the environment, new knowledge can be constructed upon it. The most successful example of this is modern theoretical physics.
A network is a network; a network is just a network. Like Lao Tzu's Tao, it is a belief held in vague thought. Our perspective determines what we can get.
Projections of the network along different dimensions also bear a certain similarity to one another, and levels can in principle be converted into each other through suitable mathematical processing. But that is theory; the real situation is the network's probabilistic expression.
The network abandons causality: different concepts are connected by probability. Only beneath a set of concepts, where probabilities accumulate, may a connection approach 100%. These quantum-level connections are relatively deterministic, but the concepts we ordinarily use are high-dimensional, and the probabilistic connections between them fluctuate. That is still enough for ordinary people to build a system of causality (bundled conceptual connections), such as Newton's classical mechanics, even though from a higher-dimensional viewpoint, such as Einstein's theory of relativity, it is not strictly valid.
Energy minimization, hierarchical games, and the network's coupling structure together give rise to a certain distribution.
In essence, the world cannot be fully grasped; experience and causality are the products of our search for an illusory sense of security. The theoretical edifice of mathematics and physics is therefore always unstable, and we are constantly repairing it, like Sisyphus in Greek mythology forever pushing his stone uphill: seemingly useless work that in essence maintains the existence and continuation of the overall network.
Knowledge is a combination of specific sequences in the network; communication across its multiple levels forms part of the differential equations that construct the connections.
Relativity is a characteristic of motion at every level of the network; proportion equals probability, which is the network's language of expression.
To describe the nature of a network, probability can be understood through frequencies derived from long-term observation. For the occurrence of a single thing, however, a probability can only be realized through sufficient hierarchical coupling, and in any case there are no absolutes. This shares a source with Heisenberg's uncertainty principle: every level is a spread of the network's distribution function, and treating things as atomic ignores the probability mass at both tails of the distribution, which we can only understand statistically.
Allocate resources rationally so that the marginal benefit stays greater than zero, that is, keep selecting the locally optimal path; the final result must then lie in a discrete neighborhood of the globally optimal intrinsic nature.
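The "constantly selected local optimal path" reads like greedy hill climbing. A minimal sketch, in which the objective function, step size, and stopping rule are all illustrative assumptions: keep taking whichever neighbouring step has positive marginal benefit, and stop when none remains.

```python
def greedy_maximize(f, x0: float, step: float, n_iters: int = 1000) -> float:
    """Hill climbing: repeatedly take whichever neighbouring step
    improves f (marginal benefit > 0); stop at a local optimum."""
    x = x0
    for _ in range(n_iters):
        best = max((x + step, x - step), key=f)
        if f(best) <= f(x):  # no positive marginal benefit left
            break
        x = best
    return x

# On a single-peaked objective the local optimum is also global.
peak = greedy_maximize(lambda x: -(x - 3) ** 2, 0.0, 0.5)
print(peak)  # → 3.0
```

On a multi-peaked function the same procedure only lands somewhere "around" an optimum, which is exactly the discrete-neighborhood caveat the text makes.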
The eigen-sequences are the paths along which the network's probability collapses; only by matching these sequences can the maximum benefit be obtained.
Guided by experience, locate the highly connected central node, that is, the key problem; decompose it into modules; then integrate the modules in a certain mode to form a network.
The value of things is inversely proportional to their number: a power-law distribution.
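"Value inversely proportional to number" is the shape of a Zipf-type power law: the item at rank r carries a weight proportional to r to the power of minus alpha. A minimal sketch, where the exponent and item count are illustrative assumptions:

```python
def zipf_weights(n_items: int, alpha: float = 1.0) -> list:
    """Normalized power-law weights: rank r gets weight proportional to r**-alpha."""
    raw = [r ** -alpha for r in range(1, n_items + 1)]
    total = sum(raw)
    return [w / total for w in raw]

weights = zipf_weights(10)
# Rank 1 carries the largest share; with alpha = 1 each rank's weight is
# exactly inversely proportional to its rank.
print([round(w, 3) for w in weights])
```

With alpha = 1 the rank-1 item is exactly five times as heavy as the rank-5 item, which is the "few valuable, many common" pattern the sentence describes.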
The truth or falsity of a proposition can be regarded as 1 and 0, and running a long string of propositions through successive operations finally yields a result; this requires us to assign values.
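Treating truth values as 1 and 0, a "long string of propositions" is just a compound Boolean expression evaluated under an assignment of values. A minimal sketch; the particular formula and assignments are illustrative assumptions:

```python
def evaluate(p: int, q: int, r: int) -> int:
    """Evaluate the compound proposition (p AND q) OR (NOT r) as a 0/1 value."""
    return int((p and q) or (not r))

# Assign 0/1 values to the atomic propositions, then run the operations.
print(evaluate(1, 0, 0))  # → 1
print(evaluate(1, 0, 1))  # → 0
```

The result is fixed only once every atomic proposition has been assigned a value, which is the point the sentence makes.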
When the content and form of a proposition contradict each other, as when "I am lying" internally negates what its external form asserts as a whole, it becomes a closed, circulating system. We treat it for now as a coupling point: it cannot be computed and cannot be directed at itself.
Logical reasoning contains a certain amount of non-computability.
Composite propositions can be split apart, but the process reduces dimensionality and loses some information; in that sense atomic propositions are low-dimensional.
The If... Then... rule: p → q is false only when p is true and q is false. The idea it embodies is that a claim is absolutely wrong only under the right conditions.
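The classical truth table for material implication matches the rule above: p → q is equivalent to (NOT p) OR q, and is false in exactly the one row where p is true and q is false.

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material implication: false only when p is true and q is false."""
    return (not p) or q

# Enumerate the full truth table.
for p, q in product((True, False), repeat=2):
    print(f"{p!s:>5} -> {q!s:>5} : {implies(p, q)}")
```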
There is a limit to the steps of the operation, and iteration will bring chaos.
Can the absence of conjunctions be considered probabilistic operations?
An OR operation whose branches are simultaneously true at the quantum level shows that atomic propositions are not basic enough.
Logical operations on sets and on sets of sets should not be treated as equivalent.
The existence of a mapping represents a logical correspondence.
After equivalent substitution within a formula, its operations can be transformed to some extent, yet the result may cease to be equivalent, because the effect of an operator is limited.
In other words, double negation returns the thing itself. I do not think that is quite right; there should be some progress.
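In classical two-valued logic, double negation does return the original value; the feeling that "there should be progress" is closer to intuitionistic logic, where double-negation elimination is not a theorem. A minimal check of the classical case:

```python
def double_negate(p: bool) -> bool:
    """Classical double negation: NOT (NOT p)."""
    return not (not p)

for p in (True, False):
    # Classically, no "progress" is made: the value comes back unchanged.
    assert double_negate(p) == p

print("double negation is the identity on {True, False}")
```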
(1) Error is always wrong. (2) This sentence is true. (3) This sentence (this judgment) is wrong.