Fig. 2(c). Jump connection. Fig. 2(d). Jump connection and negative feedback.
Fig. 2. Generation effect on test data (the clear picture is the input, its label is the classification result, and the blurry picture is the output of the network).
Fig. 3. Round function with jump connection and negative feedback.
In order to generate output that is very similar to the input, we can use jump connections (shown in Fig. 3) and negative feedback. The output of a layer of the abstract network, which can be seen as knowledge of the features learned before, is connected to the input of the symmetrical layer of the concrete network, and the mean of the two is taken as the new input. If the output of layer two is B, the input of layer five is also set to B; this removes the error that arises because the inverse functions used before are only approximations. So the error decreases. The more jump connections (the more knowledge about features), the less training time and the greater the similarity; this fits the process of learning. Fig. 2(c) shows the result of connecting the layer next to the output layer of the abstract network with the layer next to the input layer of the concrete network. Fig. 2(d) shows the result of jump connections combined with negative feedback. We can see that the background is no longer dark when we use negative feedback.
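The averaging scheme above can be sketched as follows. This is a minimal illustration, assuming a symmetric six-layer autoencoder with tanh activations; the layer sizes, random weights, and the choice of which layers receive jumps are illustrative assumptions, not the exact architecture of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Symmetric autoencoder: layers 1-3 abstract (encode), layers 4-6 concrete (decode).
sizes = [16, 8, 4, 2, 4, 8, 16]
W = [rng.standard_normal((m, n)) * 0.1 for m, n in zip(sizes[1:], sizes[:-1])]

def forward(x, jump=True):
    acts = [x]
    h = x
    for k, w in enumerate(W):
        if jump and k > len(W) // 2:
            # Jump connection: take the mean of the decoder layer's input
            # and the activation of the symmetric encoder layer
            # (the "knowledge of features" learned before).
            h = 0.5 * (h + acts[len(W) - k])
        h = np.tanh(w @ h)
        acts.append(h)
    return h

x = rng.standard_normal(16)
y = forward(x)            # reconstruction with jump connections
y_plain = forward(x, jump=False)  # reconstruction without them
```

With the jump connections, the decoder no longer relies solely on the approximate inverse functions, which is why the reconstruction error shrinks.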
Why negative feedback? Inspired by the principle of automatic control, I add negative feedback, as shown in Fig. 3. The whole network behaves like the proportional-integral-derivative part of an automatic control system, and our aim is to make the output very similar to the input, so taking the difference between the input and the output as the new input decreases that difference. It is important to note that negative feedback only works together with jump connections or regression: the error is then small, so when it propagates to the softmax layer, the output of that layer is close to 0 and propagation stops.
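The feedback loop can be sketched as below. As a stand-in for the trained network, an imperfect linear map that roughly reproduces its input is assumed; the loop repeatedly feeds the difference between the input and the current output back in, as in classical control, and the reconstruction error shrinks.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
# Stand-in for the trained autoencoder: an imperfect identity map
# (an illustrative assumption, not the paper's actual model).
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))

def net(u):
    return A @ u

def closed_loop(x, steps=20):
    # Negative feedback: add the error (input minus output) back onto
    # the drive signal, so the output is pulled toward the input.
    u = x.copy()
    y = net(u)
    for _ in range(steps):
        u = u + (x - y)
        y = net(u)
    return y

x = rng.standard_normal(n)
y_open = net(x)          # one open-loop pass
y_closed = closed_loop(x)  # with negative feedback
```

Under these assumptions the closed-loop output ends up closer to the input than a single open-loop pass, which mirrors the brightened background seen in Fig. 2(d).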
The following is my thinking about this network and the mind and consciousness of mankind. The abstract network is like abstract thought, and the concrete network is like concrete thought. Meanwhile, the concrete network is like memory: it generates the input image in the brain, and its output can be seen as consciousness (the look of the input in the brain). Man can remember only through the generation of the concrete network. For example, suppose I had been somewhere several years ago; when I am there once more, I realize that I went there before, because the old scene is generated in my brain and the concrete network holds its parameters. When the new scene is input into the brain, the concrete network generates the old scene from those parameters, then takes the intersection of the old scene and the new scene. We know we went there before because the intersection is large. Memory is always common sense (the information remembered can be seen as common sense to the individual). Forgetting (Lethe) happens because, when new knowledge is input, the training process changes the parameters: the more new knowledge, the more the parameters change, and the more is forgotten.
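The recognition-by-generation idea can be made concrete in a toy form. Here the "old scene" stands in for what the concrete network regenerates from memory, the "new scene" is the current input, and recognition is declared when their intersection is large; the binary masks, the overlap measure, and the 0.5 threshold are all illustrative assumptions.

```python
import numpy as np

def overlap(a, b):
    # Size of the intersection relative to the union of two binary scenes.
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

old_scene = np.zeros((8, 8), dtype=bool)
old_scene[2:6, 2:6] = True   # the remembered (regenerated) scene
new_scene = np.zeros((8, 8), dtype=bool)
new_scene[2:6, 3:7] = True   # the same place, seen from a slightly shifted view
other = np.zeros((8, 8), dtype=bool)
other[0:2, 0:2] = True       # an unrelated scene

seen_before = overlap(old_scene, new_scene) > 0.5  # large intersection: recognized
```

A large overlap corresponds to "I went there before"; the unrelated scene yields a small overlap and is not recognized.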
The Application
This network can be used in deep reinforcement learning, as shown in Fig. 4. Because we take the union of common sense and the input as the new input, the logic it outputs will be more accurate than before. For example, suppose a question about the Pythagorean theorem, involving a right triangle, is to be solved. The abstract network and the concrete network can be pretrained on the Pythagorean theorem; the class produced by the abstract network is the Pythagorean theorem, which is then output through the concrete network. With the union of the input and this common-sense output, the question can be solved. The network can also be used for language translation, compression and decompression, encryption and decryption, compilation and decompilation, and modulation and demodulation.