Chapter 72: The Wild Thoughts of Internet Philosophy (2)
Given the outside temperature T, statistical physics can derive the energy distribution of the gas molecules. That is, it constructs the relationship between the macroscopic and the microscopic, with probability, understood as relative proportion, as the measure of the distribution.
The path integral corresponds to the partition function Z.
The images at different levels are different and are represented by different functions; these, again, are couplings of different functions.
The principle of least action can be understood as follows: the phase factors e^(iS) of paths with larger action cancel each other out (wave interference), leaving only the paths near the least action (relatively independent), which therefore have high probability.
The Ising model is a good simulation: both the whole and the local tend toward equilibrium, and there is some competition between the two kinds of equilibrium.
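As a concrete illustration, a minimal Metropolis simulation of the 2D Ising model can show this local/global competition; the lattice size, temperature, and step count below are arbitrary choices for the sketch, not values from the text.

```python
import math
import random

def ising_metropolis(n=20, beta=1.0, steps=20000, seed=0):
    """Metropolis sampling of an n x n Ising lattice with periodic boundaries."""
    rng = random.Random(seed)
    spins = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        # Sum over the four nearest neighbours (periodic wrap-around).
        nb = (spins[(i + 1) % n][j] + spins[(i - 1) % n][j]
              + spins[i][(j + 1) % n] + spins[i][(j - 1) % n])
        dE = 2 * spins[i][j] * nb  # energy cost of flipping spin (i, j)
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            spins[i][j] *= -1  # accept: a local move toward global equilibrium
    return spins

def magnetization(spins):
    """Global order parameter: |average spin| over the whole lattice."""
    n = len(spins)
    return abs(sum(sum(row) for row in spins)) / (n * n)

spins = ising_metropolis()
m = magnetization(spins)
```

Each accepted flip is a purely local decision, yet at low temperature (large beta) aligned domains grow, which is exactly the whole-versus-local tension the paragraph describes.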
The basic computation is a simple logical judgment; if a sufficient number of attempts are made, certain properties can emerge, such as the Monte Carlo statistics of meaningful Markov sequences (similarity based on sequence matching has a certain function). Artificial intelligence, like the human brain, is based on statistical pattern recognition, which can perform fast matching and eigen-computation. This is related to the multi-level coupled cellular structure of the brain's nervous system: its basic connective construction is a high-dimensional computation, while the relatively low-dimensional part is the flow of electric current. Because the granularity of these computations is so small, we treat them as atomic behavior and then apply statistical understanding from the network's point of view. One can predict that such a computer is equivalent to a universal Turing machine. I see this as a pattern of logical operations on the network.
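The "Monte Carlo statistics of Markov sequences" idea can be sketched directly: run a chain many times and let its statistical property emerge from counts. The two-state chain and its transition probabilities below are invented for illustration.

```python
import random

# Hypothetical two-state chain: P[i][j] = probability of stepping from i to j.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def empirical_occupancy(P, steps=100000, seed=1):
    """Monte Carlo estimate of how often the chain visits each state."""
    rng = random.Random(seed)
    state, counts = 0, [0, 0]
    for _ in range(steps):
        counts[state] += 1
        state = 0 if rng.random() < P[state][0] else 1
    return [c / steps for c in counts]

freq = empirical_occupancy(P)
# Solving pi = pi P exactly gives the stationary distribution pi = (5/6, 1/6);
# the empirical frequencies approach it as the number of attempts grows.
```

Each step is a single "simple logical judgment", yet the long-run frequencies converge to a well-defined distribution, which is the emergence the paragraph points to.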
Coupling, or self-reference, is a manifestation of the incompleteness of the network system, indicating that it can expand indefinitely. This matches the fractal worldview of the network. Many levels are isomorphic to one another; they are structures selectively expressed at other levels, which can be derived from topological invariants.
The collection of information is like collecting the terms of a Fourier series, which then approximate and converge.
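A sketch of "collecting terms until the series converges", using the standard Fourier series of a square wave (my choice of example, not the author's):

```python
import math

def square_wave_partial_sum(x, n_terms):
    """Partial Fourier sum of the odd square wave: (4/pi) * sum sin((2k+1)x)/(2k+1)."""
    return (4 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms))

# The square wave equals 1 at x = pi/2; collecting more terms approximates it better.
approx_3 = square_wave_partial_sum(math.pi / 2, 3)
approx_50 = square_wave_partial_sum(math.pi / 2, 50)
```

Each collected term is a small piece of information about the target function, and the approximation sharpens as terms accumulate.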
The coupling of the hierarchy can be expressed as convolution. In the language of probability, it is expressed as a combination of distribution functions, which contains all the information about spatial correlation.
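One precise sense in which "combining distributions" is convolution: the distribution of a sum of two independent variables is exactly the convolution of their distributions. Two dice are used below as a stand-in example.

```python
# One fair die: uniform distribution over faces 1..6 (index 0 = face 1).
die = [1 / 6] * 6

def convolve(p, q):
    """Distribution of the sum of two independent variables = discrete convolution."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

two_dice = convolve(die, die)  # index k holds the probability that the sum is k + 2
```

The combined distribution peaks at 7, the most probable sum, showing how the coupling of two levels produces structure neither level has alone.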
Information has a relative meaning: it is a relative proportion at a particular level. But at the statistical level it is possible to ignore this relativity and express it as a measure of absolute information, which is Shannon's binary encoding. Although the distributed nature of the network limits the connections between objects at different levels (as in six degrees of separation), it is possible to adopt a specific pattern to deal with a problem at a specific level. But information itself is coupled to the path of its transmission (from the network's point of view); that is, we must consider the channel at the same time.
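The "absolute measure" the paragraph refers to is Shannon entropy in bits; a minimal sketch:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: the average number of binary symbols needed per outcome."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair_coin = shannon_entropy([0.5, 0.5])    # exactly 1 bit
biased_coin = shannon_entropy([0.9, 0.1])  # less than 1 bit: more predictable
```

The relative proportions (the probabilities) go in, and a single absolute number in bits comes out, which is what makes binary encoding possible.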
How can we continue to construct a statistical-level network on the basis of information theory? Construct a certain sequence according to the relative position of two basic nodes (multi-level coupling), and construct a network relationship according to a certain similarity in the process of ascending dimensions.
I think the various kinds of image processing come relatively close to network thinking: information at different scales is represented as functions of different frequencies, the different levels of information we recognize.
The leap from analytical mechanics to network mechanics: (no idea for the moment, but I know this is an inevitable development; perhaps we can refer to the viewpoint of quantum mechanics, or even quantum field theory).
Newtonian mechanics solves differential equations. The mechanical analysis of the network treats the hierarchy as a variable; it can also take the eigenvalues of the distribution function as variables to construct a certain complex relationship, that is, a differential equation, which is the coupling between the levels and can be regarded as a functional.
The Euler equation describes the relationship between levels: the low-dimensional treatment of the higher dimension equals the high-dimensional treatment of the lower dimension.
The Lagrangian method is really an understanding of the high-dimensional structure, solved in the low-dimensional situation according to its special values. Finding the extreme value of a function is a sequence converging to meaning; when the differential between the levels goes to 0, that is a margin, a kind of convergent boundary: the principle of least action.
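In standard notation, the least-action statement the last few paragraphs gesture at is: the action S is stationary exactly when the Euler-Lagrange equation holds,

```latex
S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\, dt,
\qquad
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0 .
```

The "differential between levels going to 0" corresponds to the vanishing of the first variation of S.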
Hooke's law: the network representation of f = -kx is a probability distribution of motion, describing a linear eigenrelationship. It can be pictured as two interconnected spring oscillators on a smooth plane, where the length of the springs plays the role of probability and the motion obeys conservation of momentum, i.e., motion is proportional to relative proportion.
Different formulations of the same eigenstructure, according to new differential equations. The result can be expressed as a coupling of many periodic functions (a Fourier series).
This is a description of geometry (hierarchy of convergence).
Gravity can be expressed as a constant tendency, i.e., as a processing of probability.
Force can be written as the negative first derivative of the potential function with respect to position, and probability density is the first derivative of the distribution function with respect to position.
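Both derivative relations can be checked numerically; the harmonic potential and the exponential CDF below are my illustrative choices.

```python
import math

def derivative(f, x, h=1e-6):
    """Central-difference numerical derivative."""
    return (f(x + h) - f(x - h)) / (2 * h)

k = 2.0
V = lambda x: 0.5 * k * x ** 2   # harmonic potential (illustrative)
force = -derivative(V, 1.5)      # F = -dV/dx = -k*x, so -3.0 at x = 1.5

F = lambda x: 1 - math.exp(-x)   # CDF of an exponential distribution
density = derivative(F, 1.0)     # density = dF/dx = exp(-x), so exp(-1) at x = 1
```

The formal parallel the paragraph draws is exactly this: in both cases, the local quantity (force, density) is the slope of a cumulative one (potential, distribution function).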
Noether's theorem: a symmetry of a physical system corresponds to a conserved quantity. This underwrites our confidence in constructing a philosophical theory, like the equivalence of a computer and a universal Turing machine: when its combination is 0 it is a conserved quantity, and its combined form is a meaningful sequence.
Hamiltonian mechanics is the processing of global information; the Hamiltonian is a description of the state of the system as a whole.
Network mechanics is a simulation of quantum mechanics. Based on linear algebra, the vector inner product is a traversal of possibilities, from which meaningful patterns can emerge according to their adaptability to the environment. Operators are a kind of processing of high-dimensional patterns, and there is also a certain distribution at this level.
The eigenconcept of a network is derived from the eigenvalues of linear algebra and is a description of the whole.
The Schrödinger equation is a coupling of different levels, and its combined form and certain sequence correspond to a certain high-dimensional structure.
The stable structure is the emergence of patterns, and the two are mutually causal and are a coupled dynamic equilibrium.
The essence of lateral inhibition (like apical dominance in plants): each neuron uses surrounding neurons to make predictions about its perceptual inputs, and this simple prediction converges extremely quickly; the network tends to bring the inputs into line with the predictions, but requires the individual to adapt to the dynamics of the environment.
A neural network is the construction of a probabilistic network; the weight distribution and the probability distribution of the results are based on statistics.
Multi-level coupling, convolution between layers
The probability of occurrence differs between states: high-energy states have low probability and low-energy states high probability. The final expression is the selective expression of these two relatively independent levels, which is a Nash-equilibrium state.
In thermodynamic equilibrium, the probability that the network is in a state depends only on the energy of that state: the probability is proportional to the negative exponential of the energy, so the higher the energy, the lower the probability of occurrence; this is a Boltzmann (exponential) distribution.
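A minimal sketch of this Boltzmann weighting over a few invented energy levels:

```python
import math

def boltzmann(energies, T=1.0):
    """p(E) proportional to exp(-E/T), normalized over the given levels."""
    weights = [math.exp(-E / T) for E in energies]
    Z = sum(weights)  # the partition function
    return [w / Z for w in weights]

probs = boltzmann([0.0, 1.0, 2.0])  # three illustrative energy levels
```

Normalizing by Z is exactly the "probability as relative proportion" move made at the start of the chapter.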
The relationship between quantum mechanics and group theory is a premise for exploring network relations. Group theory can be regarded as a higher-dimensional model of linear algebra, the equivalence of eigeninformation used to construct new relations; for example, the commutation relation between the position operator and the momentum operator is [X, P] = iħ (the equivalence of transformations).
A group is a set of self-consistent rules of operation: there is an identity element, every element has an inverse, and multiplication is associative.
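A small concrete check, using the cyclic group of 90-degree rotation matrices (my choice of example): identity, inverses, closure, and associativity can all be verified directly.

```python
def matmul(A, B):
    """2x2 integer matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]   # identity element
R = [[0, -1], [1, 0]]  # rotation by 90 degrees

# Closure: powers of R cycle through exactly four elements.
R2 = matmul(R, R)
R3 = matmul(R2, R)
R4 = matmul(R3, R)     # back to the identity, so R3 is the inverse of R
```

This also previews the later remark that group elements can be mapped to matrices: here the abstract rotations simply are matrices.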
Abelian group: if any two elements satisfy the commutative law, it can be regarded as an ideal economic system with zero transaction costs.
Symmetric transformations can be represented by groups, and the elements of groups can be mapped as matrices.
Probability is the relative proportion within a matrix. Just as the imaginary number i entered physics, there are certain latent variables within probability, which are the set of higher-order derivatives of its effects.
A superposition state is the average over the area of the function.
Thinking in the dimension of the network, there is no geometric transformation such as rotation, but a more high-dimensional network structure that common sense cannot grasp: distribution.
Wave-particle duality is a description of the network, and a specific experiment is the result of its selective expression. The equivalence of matrix mechanics and wave mechanics is the construction of a relationship between the network's basic mathematical form, linear algebra, and the wave form of probability. The uncertainty principle is a description of the distribution of a network.
Quantum computers use the principle of superposition to achieve truly large-scale parallel computation; this is an annealing effect that makes the result we want appear. (One fear is that our world may lose interaction with other parallel worlds and eventually go to ruin.)
Euler's formula can be expressed as a distribution of probabilities.
The introduction of new levels triggers interactions that create a certain amount of uncertainty.
Understanding computation in the sense of relationships between symbols is a high-dimensional structure that, relatively speaking, performs basic quantum computing; this is the emergence of a network pattern, and we are confident enough that it is correct. For example, the exponential function is a kind of mapping whose rate of change is itself, which is a coupling, like the identity element of a group.
Feedback is the basic model. Its selective expression at different levels can construct a correspondence (a Fourier series) with, in theory, all continuous functions, and a certain logical structure can form between different levels, that is, the formation of the various logic elements. A high-dimensional mechanism emerges through local interactions.
The neural network assigns connections between the hierarchies, and eventually forms pathways.
Economic modeling methods and their key techniques may inspire us in building networks: multi-level dynamic games that start from an analysis of basic interactive behavior, use relatively simple program rules in the computer to build models, allow a large number of interactions to generate the system from the bottom up, and finally use the emergent properties of the system to map onto and explain the laws of reality.
There are different levels of emerging patterns, and the combination of different patterns at different levels can describe different systems.
Equilibrium can be achieved only under certain special conditions; fluctuation within a certain interval is the general result of the system. This non-equilibrium state is the result of a multi-level game, a limited manifestation of infinite combinations, and can adapt to the environment without limit (within a certain range). The ideal equilibrium state and the realistic non-equilibrium distribution together form the high-dimensional structure of the network. All kinds of equilibrium are relative; that is, they can exist only within a certain range or sequence. And reality is nothing more than choice, which is network behavior.
To integrate micro- and macro-economic phenomena, we need something like the foundational move of the quantum hypothesis. Considering the analytic structure, a certain dissipative structure can form, which can describe the dynamic evolution and emergence of economic systems.
The evolution of the network is self-similar to the structure of the network.
The effectiveness of the game is limited: because information is distributed, the local game rests on the multi-level distribution of the network. This is like the ideal assumption of the rational man in economics; just as friction, a resistance to change, allows things to converge, each term of a Taylor series can be regarded as a game at a certain level, and the variation of its coefficients is based on the variation of a certain transfer matrix.
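The Taylor-series picture at least agrees with how truncation works: low-order terms already carry most of the value near the expansion point. A sketch with exp(x), my choice of example:

```python
import math

def taylor_exp(x, order):
    """Partial Taylor sum of exp(x) about 0, up to the given order."""
    return sum(x ** n / math.factorial(n) for n in range(order + 1))

# Near the expansion point, low orders already capture almost everything.
err2 = abs(taylor_exp(0.1, 2) - math.exp(0.1))
err5 = abs(taylor_exp(0.1, 5) - math.exp(0.1))
```

The second-order truncation is already within one part in a thousand at x = 0.1, which is the sense in which a "limited game" of a few levels can suffice.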
The existence of the distribution is based on various choices; individuals can form certain pattern preferences, which can then form a certain path dependence. This distribution structure is also self-similar, and certain positive and negative feedback mechanisms can emerge at different levels. A multi-level game considers not only the internal game but also the influence of the external environment. For example, the Nash equilibrium of the prisoner's dilemma is the worst outcome for both players, but when we consider the influence of the social level, we find that this is good news. There is a certain conservation law here: interests are conserved, and the large level of society can take over the interests transferred from individuals at the small level.
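The prisoner's-dilemma claim can be made concrete; the payoff numbers below are a standard textbook-style choice, not from the text.

```python
# Symmetric prisoner's dilemma, payoff[row][col] = row player's payoff.
# Strategies: 0 = cooperate, 1 = defect. Numbers are illustrative.
payoff = [[3, 0],
          [5, 1]]

def best_response(opponent):
    """The row player's best strategy against a fixed opponent strategy."""
    return max((0, 1), key=lambda s: payoff[s][opponent])

# Defect is the best response to either strategy, so (defect, defect) is the
# Nash equilibrium, even though mutual cooperation pays more (3 > 1).
```

The "worst for both" equilibrium is exactly the gap between payoff 1 at mutual defection and payoff 3 at mutual cooperation; the social level is what can reshape those payoffs.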
As long as certain rules are established, high-dimensional properties will inevitably emerge, and we should pursue a model that closely resembles the real network. That is, these emergent properties can in turn influence the underlying construction and form a coupled structure.
The influence of various factors can be expressed as a certain transition matrix that affects the probability.
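A transition matrix "affecting the probability" can be iterated directly; the 2x2 matrix below is invented, and repeated application drives any starting distribution toward the stationary one.

```python
# Invented 2x2 transition matrix; row i gives the move probabilities from state i.
P = [[0.8, 0.2],
     [0.3, 0.7]]

def step(v, P):
    """One application of the transition matrix to a distribution row-vector."""
    return [sum(v[i] * P[i][j] for i in range(len(v))) for j in range(len(P[0]))]

v = [1.0, 0.0]  # start entirely in state 0
for _ in range(100):
    v = step(v, P)
# v has converged to the stationary distribution (0.6, 0.4) of this matrix.
```

Each "factor" here is a row of influence probabilities, and the long-run distribution is the net effect of all of them combined.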
The network only emphasizes the interaction of the various levels; the combinatorial transformations that can form between different levels are emergent and topologically equivalent. Therefore it does not matter how you connect things, because eventually a topologically equivalent network model will form: the isomorphism and invariance of relations, the result of the selective expression of the hierarchy. This is what we generally think of as all things returning to one.
Interests are multi-layered, which is the inevitable differentiation of the network's distribution. There are ordinary people who focus on their own interests, and there are philosophers who hold the world in mind; it is normal to have this distribution and differentiation. In essence, of course, ordinary people pay the same attention to the interests they recognize as philosophers do, and because interests are multi-level, the network is formed.
The formation and memory of networks, which are high-dimensional structures, can, like genes, have a certain impact on low-dimensional structures such as molecular pathways. The coupled structure is reflected in similarities in behavior, learning, language, and so on, just as the brain's mechanism of action is the result of multi-level coupling, such as the mappings in different brain regions.
The distribution of the distribution, i.e., the coupling of the power-law distribution and other forms of distribution, i.e., selective expression
The God of the West is the Tao of the East: a holistic description of all the rules of the universe.
The Bayesian formula is a description of coupling: it reveals the relative proportional relationships of the higher dimension. This is similar to the feedback loop of a circuit system, with transfer function HG/(1 + HG). It says nothing about the order of events, but we can infer the probability of orderings from it. How to express it in matrix form? Define different nouns, put them in binary correspondence, and combine them into a certain determinant. The transition matrix of a Markov model may be such a model (convergence, hierarchical division). To make it continuous, some calculus can be applied, combined with graph theory.
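Bayes' rule itself is small enough to state in code; the base rate and test accuracies below are hypothetical numbers for illustration.

```python
def bayes(prior, likelihood_h, likelihood_not_h):
    """P(H|E) = P(E|H) P(H) / P(E), with P(E) by total probability."""
    evidence = likelihood_h * prior + likelihood_not_h * (1 - prior)
    return likelihood_h * prior / evidence

# Hypothetical test: 1% base rate, 95% sensitivity, 10% false-positive rate.
posterior = bayes(prior=0.01, likelihood_h=0.95, likelihood_not_h=0.10)
# A positive result raises the probability from 1% to just under 9%.
```

The "relative proportional relationship" is visible in the return line: the posterior is the hypothesis's share of the total evidence.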
Multi-level coupling and games, the construction of neural networks, and the expression of local probabilities are matrix-like behaviors. The coupling of hierarchies can be represented as stackings of functions.
The nature of the Ising model is an emergent statistical hierarchy, as is the collapse of path choices in simulated annealing. The genetic algorithm is an equivalence, and the construction of artificial life is a self-coupled, self-similar network system.
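A minimal simulated-annealing sketch of that "collapse of path choices", with an invented double-well objective; the cooling schedule and step sizes are arbitrary choices.

```python
import math
import random

def anneal(f, x0, steps=5000, T0=1.0, cool=0.999, seed=0):
    """Simulated annealing: accept uphill moves with Boltzmann probability."""
    rng = random.Random(seed)
    x, T = x0, T0
    for _ in range(steps):
        cand = x + rng.uniform(-0.5, 0.5)
        dE = f(cand) - f(x)
        if dE < 0 or rng.random() < math.exp(-dE / T):
            x = cand   # downhill always; uphill only while T is still warm
        T *= cool      # geometric cooling schedule
    return x

# A double well with minima at x = -2 and x = +2; start on the central barrier.
best = anneal(lambda x: (x * x - 4) ** 2, x0=0.0)
```

Early on, high temperature keeps many "paths" alive; as the system cools, the choices collapse into one of the wells, mirroring the Boltzmann acceptance rule from the thermodynamic-equilibrium paragraph above.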
Network applications of mathematical logic: first-order completeness (all logically valid formulas of the first-order predicate calculus are provable, a treatment of the intrinsic properties of the network) and high-dimensional incompleteness (first incompleteness theorem: let the system S contain first-order predicate logic and elementary number theory; if S is consistent, i.e., free of contradiction, then there is a sentence T such that neither T nor not-T is provable in S).
Gödel's "On Formally Undecidable Propositions of Principia Mathematica and Related Systems" is an attempt to understand logically the complex structure of networks, their multi-level coupling. But all we can do is construct relationships in low-dimensional cases; traversal while ascending in dimension carries a great risk of failure, so we need coupling from other levels, such as experience, to bring our model closer to reality.
Paradox is a testament to the coupled fractal structure of the network, and the undecidability of the Turing-machine halting problem is a limitation on any overall description.
First-order arithmetic systems require a coupling structure like that of group theory: an identity element, invertible (inverse) operations, and closure under the operation.
The correctness of mathematical induction should be finite and convergent, like the concept of a limit.
The application of Lamarckism at the level of gene expression; the Darwinian network of variation, selection, and inheritance; language measured by information; the selective expression of topology; the similarity and stability of the multi-level games and network formation of economy and society; the general growth pattern of the analytic structure; self-similarity and self-replication; equilibrium as a transitional state; the evolution of organisms as variation embodied at the hidden structural level, finally accumulating to a certain threshold and changing abruptly; large-scale emergence based on interaction. If we can trace things back to a very fundamental level, we will find that everything is the same, and any level is a selective expression of other levels. This self-coupled, self-similar structure is a holistic description of the universe. And the dignity of life lies in the exquisiteness of its combined forms.
The boundaries of the world are information; the world outside our observation range is meaningless to us as observing individuals. However, there are many observers, and the exchange of information between us lets us believe in the existence of the world. Existence is really a sense of certainty, the elimination of uncertainty.
The nesting of functions is the fractal structure of the network.
Understanding the world in complex numbers is a kind of hierarchical depth, but we should note that this depth is convergent; that is, it does not satisfy the unbounded recursion of mathematical induction. For example, given i = √-1, can we continue to take j = √i? This is a hybrid coupling; because of the finite average distance of the network, we are confident that it converges, just as with a Taylor series, where taking the first- and second-order terms is enough.
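The question "can we continue to take j = √i?" has a concrete answer that supports the convergence claim: the square root of i is again an ordinary complex number, (1 + i)/√2, so the tower of roots never leaves the complex plane. A quick check:

```python
import cmath
import math

i = 1j
j = cmath.sqrt(i)  # principal square root: (1 + i) / sqrt(2)
# Squaring returns i, so the tower i, sqrt(i), sqrt(sqrt(i)), ... stays
# inside the complex plane: no new number system is needed at each level.
```

This is the algebraic closure of the complex numbers: the hierarchy of root-taking converges inside one system, rather than demanding a new "dimension" at every step.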
A network is a set of relational structures, a hierarchy of acceleration, which can summarize the overall trend, as gravity does.
Under conditions that cannot be repeated many times, i.e., when frequency cannot be equated with probability, we need to synthesize more information so that meaningful information can emerge. This requires a wider space (to construct a many-to-one map): we first introduce complex spaces, and then fractal spaces, i.e., selective representations of matrix spaces.
ψ = a + bi: a is the observable probability, b is the hidden space, and behind probability there is a more basic complex probability that constrains probability itself. |ψ|, the modulus of ψ, is the result of observation in a specific process.
Euler's formula e^(ix) = cos x + i sin x connects the complex and exponential spaces.
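The identity is easy to verify numerically at any point; x = 0.7 below is an arbitrary choice.

```python
import cmath
import math

x = 0.7  # arbitrary test point
lhs = cmath.exp(1j * x)                   # e^(ix)
rhs = complex(math.cos(x), math.sin(x))   # cos x + i sin x
```

The two sides agree to machine precision, which is the "connection" between the exponential and the complex circle stated in the formula.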
The basic events of the network may be incomplete, i.e., there are always exceptions; the exponent is an equilibrium made by our choice, just as with a Taylor series, where taking the first- and second-order terms is sufficient.
Coordinate-system transformation is the construction of equivalence. The relativity of eigenvalues is related to the matrices that multiply them on the left and on the right.
The mechanism of feedback forms as the result of a matrix transformation: a description of what a coupled pair does.
Achieving a purpose is the reduction of the uncertainty of information and the increase of the certainty of the purpose. Many paths form a network with a certain intrinsic structure, the path of the network's average distance. In circuits, this equivalence appears as Thevenin's theorem and Norton's theorem.
Time is reversible and as rigorous as a clock, and the future can be predicted with certainty, only at a tiny level; when many levels couple, a multi-level Nash game is likely to form, and chaos appears.
Time series, like Markov sequences, are the properties of networks with information, and we can understand their properties statistically.
A selective expression of the network: tree-like branching, and then a certain similarity between the different trees, which can compete and so form a certain equilibrium, either antagonism or cooperation. A probabilistic connection is the possible expression result of a branch. It has a certain statistical nature: the explosive, leapfrog development of the network.