Chapter 33: Conjectures on a New System of Logic
Abstract: Owing to the imperfections of classical logic, various new logic systems have been proposed in turn, such as multi-valued logic, fuzzy logic, and so on. Based on the principles of the algebra of logic and related principles of probability theory, and inspired by the sequence comparison methods of bioinformatics, a new logical system is established.
Keywords: logical system, sequence operations, probability
Body:
Introduction: The essence of classical logic is tautology: it constructs connections between known events, and no new relationships are generated in the process. In the original system the basic events originate in the real world, yet the construction of connections between events has broken through the scope of ordinary human thinking and can bring us ample inspiration. This is element-based logical reasoning; it is followed by combinatorial logic, a relatively higher-dimensional logical structure.
Method: First, a preliminary discussion of various non-classical logic systems supplies the sources of inspiration; the proposed logical system is then introduced in detail.
Mathematical logic is the basis of operations
Multi-valued logic considers degrees of truth between the two quantum-like extremes 0 and 1 (for example the intermediate value 0.5). Considering multiple situations is the basis of probability, i.e. finite events; the next step is to determine each probability from how often it occurs. Fuzzy logic, on the other hand, applies a more definite pattern of magnitude to continuous, infinite events (as in calculus) and to properties of the whole: a membership of, say, 0.6 is given by a distribution function and is expressed only when it exceeds a certain threshold, a threshold determined by proportion (for instance the top 3%). Fuzzy logic is thus essentially the language of probability.
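The idea of a membership function that is "expressed only when it exceeds a threshold" can be sketched in code. This is a minimal illustration, not a standard fuzzy-logic library: the function name, the piecewise-linear shape, and the 0.6 threshold are all illustrative assumptions.

```python
def membership_tall(height_cm: float) -> float:
    """Illustrative piecewise-linear membership function for 'tall':
    0 below 160 cm, 1 above 190 cm, linear in between."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30


def is_tall(height_cm: float, threshold: float = 0.6) -> bool:
    """Express the property only when membership exceeds the threshold,
    as the text describes."""
    return membership_tall(height_cm) > threshold
```

Here a height of 185 cm has membership about 0.83 and is expressed as "tall", while 175 cm has membership 0.5 and is not, matching the text's point that the graded membership collapses to a yes/no expression only past a cutoff.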
Modal logic is a system that combines the necessity of classical logic with the possibility of probability theory; it is a relatively comprehensive logic.
Deontic (normative) logic provides the direction in which reasoning proceeds, temporal logic provides time horizons, and paraconsistent logic provides the inspiration to accommodate contradictions: contradictions can be seen as competition between levels, and the Nash equilibrium of economics can be seen as a steady state. Under different adjustments, the steady state that forms can be regarded as the intrinsic nature of the network.
Classical logic covers results that are absolute, distinct, and unambiguous, such as dividing people into good people and bad people. But reality is not so simple: there is no absolute difference between good and bad, and "good people do bad things" or "bad people do good things" are merely selective expressions. It therefore makes sense to use classical logic at the quantum level, e.g. 1/0. Our ordinary level of action, however, is macroscopic, which requires continuous traversal of the quantum level in order to ascend to the macroscopic level. Here we can refer to the development of computer languages, from machine language to assembly language to high-level language, the most important step of which is definition, such as ASCII coding. Certain sequences can function like the connectors or inflections of a language, and these sequences are selective expressions of larger sequences. To describe the overall state fully, consider using a wave function to describe the whole network; any specific case is then a selective expression of that network. Such a description can accommodate all kinds of contradictions, because a particular case expresses only one situation, and when the situation changes it can express the opposite. Because the network is the coupling of various situations, we can describe it only in the language of probability. For example, the prior probability that a good person has done good deeds is relatively high, and the probability of his continuing to do good deeds is also relatively high, but the possibility of his doing bad things is not ruled out; one can only say that, over a longer time horizon, the frequency of such acts is relatively small.
Specific sequences can occur as inferences, such as if-then.
The range of binary logic allows the construction of circuit systems, so that the network can solve all the problems inside it. It can be translated into other logical systems and can be seen as a constant collapse of probabilities.
Absolutely correct information is no information, or all information: a statement such as "the temperature may drop below 0 degrees" is true in any case, and information is a measure of certainty.
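The claim that a certain statement carries no information can be made precise with Shannon entropy, which the text's phrase "information is a measure of certainty" gestures at. A minimal sketch (the function name is mine; the formula is the standard Shannon entropy in bits):

```python
import math


def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) over a probability
    distribution; terms with p == 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)
```

A certain event (probability 1) has entropy 0 bits, i.e. observing it yields no information, while a fair coin has entropy 1 bit, the maximum uncertainty for two outcomes.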
The collapse of the network can be thought of as solving for the eigenvalues of a matrix.
An eigen-sequence is a specific sequence that is able to express a nature similar to that of the overall sequence.
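The analogy between network collapse and eigenvalue extraction can be illustrated with power iteration, the simplest algorithm that makes a matrix "collapse" onto its dominant eigenvector. This is a generic numerical sketch, not a method proposed by the text; the function name and iteration count are illustrative.

```python
def power_iteration(A, iters=100):
    """Estimate the dominant eigenvalue and eigenvector of a square
    matrix by repeated multiplication: the iterate converges to the
    direction the matrix most strongly amplifies."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    # Rayleigh quotient gives the eigenvalue estimate for this vector
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = sum(Av[i] * v[i] for i in range(n)) / sum(x * x for x in v)
    return lam, v
```

Whatever vector one starts from, repeated application of the matrix suppresses all but the dominant direction, a loose numerical analogue of the "collapse" the text describes.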
A measure of correlation reveals a given property only when it exceeds a certain threshold.
Probabilistic network logic (the product of long-term thinking and of discussion with classmates; when they dismissed it as having no logic, the point was not that it lacks logic, only that it thinks further on the basis of ordinary logic, somewhat like moving from the linear to the nonlinear). Assumption: the network is the highest-dimensional structure, and every logical system is a different path of its collapse, which is essentially the interaction of various factors.
The probability network, like a wave function, describes the whole model: without observation we can grasp only the whole, and when we start from a certain anchor point, what we observe of the network is a particular path, that is, a collapse of the probability network. Its tolerance of contradiction is reflected in the fact that observations from different points differ yet are all correct (the compatibility of multiple systems is the result of their selective expression). Each observation is the wave function of the whole network substituted into a certain observation function, and the result of the coupling of the two functions is the observed value. Different observation angles may be complex; refer to the decomposition of a function into simple components in Fourier analysis.
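The Fourier decomposition the text points to can be sketched concretely: a single observed signal is decomposed into simple frequency components, each a "selective expression" of the whole. This is a naive discrete Fourier transform written out directly from the definition, for illustration only (real work would use an FFT library).

```python
import cmath


def dft(samples):
    """Naive discrete Fourier transform: X[k] = sum_n x[n] *
    exp(-2*pi*i*k*n/N). Each coefficient measures how strongly one
    simple oscillation is present in the observed sequence."""
    N = len(samples)
    return [
        sum(samples[n] * cmath.exp(-2j * cmath.pi * k * n / N)
            for n in range(N))
        for k in range(N)
    ]
```

For a pure cosine sampled over one period, all the energy concentrates in a single pair of coefficients, showing how one observation decomposes into elementary components.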
Computability of probabilistic networks: since networks are multi-layered, each level must be decomposed down to the quantum level before operations can be performed. For example, the definition of a person is decomposed into several indicators a, b, c, d, e, f, g; an indicator that is satisfied is recorded as 1 and one that is not as 0, so a unique individual can be decomposed into a sequence such as 1010101. Assume first that the indicators are equivalent: if an individual with 5 or more 1s is considered good and one with fewer than 2 is considered bad, then the individuals in between have a certain bias, which can be quantified by proportion. Different indicators can likewise be assigned different weights without changing the scheme. The above is first order, i.e. our ordinary classical logic. What I am looking for is higher-dimensional thinking, so we come to the second order, that is, sequence matching, which the tools and ideas of bioinformatics can be introduced to explain.
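The first-order scheme just described, indicators encoded as a 0/1 sequence, a threshold of 5 for "good" and fewer than 2 for "bad", with optional weights, can be sketched directly. The function name and return labels are illustrative choices, not part of the text.

```python
def classify(bits, weights=None, good_at=5, bad_below=2):
    """First-order classification of an individual encoded as a 0/1
    indicator sequence, using the thresholds from the text: a score
    of 5 or more is 'good', below 2 is 'bad', otherwise the
    individual sits in the biased middle range."""
    if weights is None:
        weights = [1] * len(bits)  # the text's equal-indicator case
    score = sum(b * w for b, w in zip(bits, weights))
    if score >= good_at:
        return "good"
    if score < bad_below:
        return "bad"
    return "uncertain"
```

The sequence 1010101 from the text scores 4 under equal weights and so falls in the middle range; changing the weights generalises the rule without altering its structure.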
Building the logical system: calculus is the basis for aligning quantized sequences, for example the alignment of different combinations of 10011100101101 and 101001101101. Bioinformatics provides many mathematical tools, such as scoring matrices (gap penalties, match scores), the BLAST algorithm, and the Needleman-Wunsch and Smith-Waterman algorithms. The end result is a range of possibilities, and the frequency of each can be understood with probability.
It is necessary to consider competition between levels and the attainment of equilibrium, and the progression from similar sequence to similar structure to similar function.
Third-order logic, by analogy with the structural folding of proteins, is built on the combinations of special elements found by the second-order matching.
The rise in dimension is essentially a selective expression of the probability network, a gradual collapse or convergence of infinite possibilities.
The hidden Markov model is one form of probabilistic reasoning over the network: it uses probability theory to study the laws of change of random natural and social phenomena, describes quantities effectively, and looks for laws in random data, that is, the intrinsic emergence of infinitely long sequences. First-order thinking looks for sequence matches, which let specific elements emerge and thereby construct high-dimensional structures, such as the α-helices and β-sheets of protein structure prediction. Second-order thinking is the change of state of an infinitely long tape, with reference to the Turing machine; it is a computational process.
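The hidden-Markov-model reasoning mentioned above can be sketched with the standard Viterbi algorithm, which recovers the most likely hidden path behind an observation sequence. To tie it to the document's running example, the states and probabilities below model a "good/bad person" emitting observable deeds; all the numbers are invented for illustration.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Standard Viterbi algorithm: for each step keep, per state, the
    probability of the best path ending there, then read off the
    globally best hidden-state path."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    path = {s: [s] for s in states}
    for t in range(1, len(obs)):
        V.append({})
        new_path = {}
        for s in states:
            prob, prev = max(
                (V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                for p in states)
            V[t][s] = prob
            new_path[s] = path[prev] + [s]
        path = new_path
    best = max(states, key=lambda s: V[-1][s])
    return path[best]
```

With hidden states "good"/"bad" and observed deeds, the inferred path echoes the earlier point: a run of good deeds keeps the hidden state "good" with high probability, while a bad deed can still flip the inference without contradicting the model.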
The matching of sequences can be regarded as the conclusion of a transaction, that is, the two systems have a coupling point; this process can also be regarded as a feedback mechanism.
On the coupling of conclusion and proof: all mathematical problems decompose into infinite sequences at the quantum level, and finding the eigen-sequences among them is a probabilistic behaviour. Mathematicians have mastered a set of methods for finding patterns and so have a greater probability of finding the coupling eigen-sequences; in this respect we can refer to the various algorithms of today.