Chapter 33 Software and Hardware

More than once, the new human civilization had entertained the same idea: to create the legendary strong artificial intelligence!

Such a strong AI would be self-aware, able to optimize itself and improve its own algorithms through endless cycles of self-iteration. In this way, its intelligence could grow without limit.

A species like that could bring about a "singularity era" of explosive technological growth within a very short time.

The idea is beautiful indeed, and it has been around since the Earth era. Setting aside the question of safety, assume for the moment that it would be safe.

People fantasized about the arrival of the technological singularity, a strong artificial intelligence that could grant immortality to all of humanity! With infinite intelligence, wouldn't immortality be a trivial matter?

All sorts of big names kept boasting, counting down to that better future: thirty years to the age of the singularity, then twenty, and so on...

But reality proved harsh.

As science and technology developed, especially after civilization went interstellar and technology truly exploded, people became more and more inclined to believe that this kind of infinitely self-evolving machine civilization simply does not exist...

In other words, you can only create species "stupider" than yourself; you cannot create species "stronger" than yourself!

This statement is easy to explain.

"We're using our brains to study our brains." Yuriko pointed to her head with her finger, thought for a long time and said, "What do you think...... Our wisdom, can we understand our wisdom thoroughly......"

Not a tongue twister, but a ...... Philosophical level questions.

Our understanding of the brain is built on human thinking, and thinking, in turn, is built on the brain itself. By the same token, if we write a program that can write programs, can that program write a program as complex as itself?

If human beings are a naturally occurring program, can that program fully comprehend itself?

"To paraphrase a doctor: if our minds are simple enough to be understood, we are too stupid to understand it; Conversely, if we are smart enough to understand our brain, it will be too complex for us to reveal. ”

"We've progressed, and so have our brains."

"If we can't even understand ourselves, let alone create a stronger species from the ground up."

"So I think that strong intellectual barriers are indeed likely to be a huge problem that is difficult to solve. Fertility screening can raise the upper and lower limits of intelligence, but it cannot pierce it. Because it is not a matter of intelligence at all. ”

"We, even if we become transhuman, probably still haven't made it."

Hearing her explanation, Yu Yifeng felt gloomy. The question had risen to the level of philosophy, and it left one feeling utterly helpless.

It was a helplessness, a powerlessness, that welled up from the depths of the heart.

No matter how intelligent and powerful people become, they can only create weaker "artificial intelligences"; infinite self-iteration remains out of reach.

A weak AI's strength lies only in its computing power. If that computing power were merely equal to a new human's, it would without question be outclassed in every other respect.

Pushed further, this paradoxical problem can be extended, on a far broader scale, to every species in the universe.

Suppose some species A could easily create another, stronger species B; could B then just as easily create an even stronger species C?

Then C would create a stronger D...

And D a stronger E...

Iterating endlessly like this, how powerful would some far-later species N be? How intelligent?

If that were how things worked, the universe would long ago have been overflowing with "god-like" beings!

Yet the starry sky remained as dim as ever; no "gods" roamed about, and no saintly spirit descended to deliver all living things.

In other words, the proposition of "creating a stronger species" is extraordinarily difficult in itself, perhaps even a false proposition outright; within the known scope, it simply cannot be done!

And the problem can be generalized even further.

The universe is the tool with which we do research: everything we can think about and experiment on rests on the basic laws this universe provides.

At the same time, the universe is also the object of our study. We study its birth and its destruction, and why it is the way it is and not something else.

That is to say... our research, and even our imagination, is itself supplied by the universe. Merely being able to approach an understanding of it is already a remarkable achievement.

"According to this theory, is the world unknowable?" Yu Yifeng said bitterly, he thought of all kinds of YY that exploded with one punch.

Yuriko said softly, "Let's not wander so far afield... The strong barrier is only a conjecture; it has never been rigorously proven. Our thinking and self-awareness really are a black-box system: we don't know how to study it, or even where to cut in."

"But I think it's possible to find out some of the things and make a small improvement to the content even if we continue to accept input and output."

Or...... We can turn to the unknown, what we don't understand, to improve. ”

Seeing Yu Yifeng's thoughtful expression, she smiled slightly: "A strong intelligence barrier like this isn't a topic for L4 civilizations, or even for interstellar civilizations at all... Let's not overthink it just yet."

"Those who have crossed the strong threshold can already be called god-level civilizations. We temporarily doubt the existence of this god-level civilization...... Sowers, yes? I don't know. ”

"What the L4 civilization faces is only a weak intelligence barrier, which can be solved clearly."

Specifically, the strong barrier involves logical boundaries, paradoxes, unsolved mysteries, self-awareness, and the problem of the brain's instruction set.

The "weak barrier" is much simpler: it is merely a biological puzzle.

The foundation of intelligence is the brain, but the brain is only the hardware; it is not, in itself, intelligence at all.

At the instant of death, within some 0.0000000001 of a second, the brain's physical properties have, in theory, not changed drastically, and yet the person is dead. Is human death instantaneous, or continuous?

If it is instantaneous, then most likely something "soft", such as self-awareness, has crashed, and that is why the person dies.

The strong barrier is chiefly about this kind of "software".

At a certain moment the "software" hangs and crashes because of a hardware failure, and the person dies.

The brain's physical structure, as "hardware", can be biologically optimized; it can be seen and touched, and it is not a black-box system the way "self-awareness" is.

"There's no paradox in using software to retrofit hardware."

The human brain is a messy device: inefficient, clumsy, and inscrutable, yet it works. By every measure, it is a poorly designed, inefficient lump that nonetheless performs surprisingly well.

Biologically speaking, evolution's imperfections have left behind rough patches that genuinely can be improved, and that is the origin of the "brain chip" people currently use.

But the current brain chip falls far short: it can add raw computation, yet it cannot make people any smarter.

"The brain developed a specific solution to a problem a long time ago, and people have been using it for many years, or improving it for other purposes, and many different kinds of wisdom have been formed. In the words of the molecular scientist FranΓ§ois Jacob, evolution is a tinkerer, not an engineer. ”

Whether through evolution or through technology, if the brain's own capability can be brought fully into play, then... the "weak barrier" has been passed.

Fertility screening is one common way of working through the weak barrier.

"Hardware is the foundation of wisdom, and software plans the boundaries of wisdom. It is impossible to play chess with a software that plays Go, this is an algorithm problem. ”

"The human brain has more complex software than any computer program, it's very powerful, it can do almost anything, it looks perfect, but it actually has its own boundaries. We can't take into account, we can't think of it, we can't feel it. ”

"It's precisely because this set of software is too strong, and the hardware is too weak."

"In other words, strong barriers are the 'software' of intelligent life, the boundaries of thinking, including unsolved mysteries such as self-awareness, and there is basically no possibility of self-improvement."

"Weak barriers are just hardware ...... It's the physical structure of the whole brain, and improving the hardware is indeed something that can be done by various means. ”

As long as there is some way to make the hardware more powerful and better suited to the software, gradually drawing out the "software's" maximum performance, the weak barrier can be said to have been passed.

If the weak barrier were passed completely, meaning the potential of the "software" fully realized, how intelligent would such a life form be?

"However, just this threshold, a weak barrier, is already as difficult as reaching the sky, and it has stuck the vast majority of interstellar civilizations!"

Yu Yifeng sighed: "I don't even know whether we've passed the weak barrier or not... In theory, the superhuman brain is still far from fully developed. Well... and has our software really been perfected?"

Their discussion over, the two of them returned to the office, each weighed down with their own thoughts.

There was a great deal of work ahead of them...