
It may be time to improve the neuron of artificial neural network
Gang Liu
Zhengzhou University; Xi'an Jiaotong University

Corresponding Author: [email protected]


Abstract

Artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence in recent years. The neuron used in ANNs was designed about 70 years ago from the stereotypical knowledge of biological neurons available at the time, and is expressed as $f(wx+b)$ or $f(WX)$. This design does not consider the information-processing capacity of dendrites. However, recent studies show that biological dendrites participate in the pre-calculation of input data; concretely, they extract the interaction information among inputs (features). Therefore, it may be time to improve the neuron of ANNs. In this study, dendritic modules with excellent properties are proposed and added to artificial neurons to form new neurons, named Gang neurons. For example, the dendrite function can be expressed as $W_{i,i-1}A_{i-1} \circ A_{0|1|2|\dots|i-1}$, the generalized new neuron as $f(W(W_{i,i-1}A_{i-1} \circ A_{0|1|2|\dots|i-1}))$, and the simplified new neuron as $f(\sum(WA \circ X))$. After improving the neuron, many network architectures can be tried; this paper shows some basic architectures for future reference. To date, the author and others have applied Gang neurons to various fields, where they show excellent performance.
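The simplified form $f(\sum(WA \circ X))$ can be illustrated with a short sketch. The PyTorch module below is a minimal, hypothetical rendering of a Gang neuron built from the abstract's formulas: a chain of dendrite modules, each taking a Hadamard product with the input, followed by one linear (soma) stage and an activation. It is written for illustration only and is not the code from the repository linked below.

```python
import torch
import torch.nn as nn

class GangNeuron(nn.Module):
    """Minimal sketch of a Gang neuron (assumption, not the repository code):
    each dendrite module computes A_i = (W_i A_{i-1}) ∘ X, and a final linear
    (soma) stage with activation f produces the output."""

    def __init__(self, in_features, out_features, num_dendrite_modules=2):
        super().__init__()
        # One weight matrix per dendrite module
        self.dendrites = nn.ModuleList(
            [nn.Linear(in_features, in_features, bias=False)
             for _ in range(num_dendrite_modules)]
        )
        # Soma: the usual linear map followed by a nonlinearity f
        self.soma = nn.Linear(in_features, out_features)
        self.f = nn.ReLU()

    def forward(self, x):
        a = x
        for dendrite in self.dendrites:
            # Hadamard product with the input extracts interaction terms among features
            a = dendrite(a) * x
        return self.f(self.soma(a))

# Example: a batch of 4 samples with 8 features mapped to 3 outputs
y = GangNeuron(8, 3, num_dendrite_modules=2)(torch.randn(4, 8))
print(y.shape)  # torch.Size([4, 3])
```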
Interesting things: (1) The computational complexity of dendrite modules connected in series ($W_{i,i-1}A_{i-1} \circ A_{i-1}$) is far lower than Horner's method; will this speed up the computation of basic functions in computers? (2) The visual field of animals has a gradient, but the convolution layer does not have this characteristic; this paper proposes receptive fields with a gradient. (3) Networks built with Gang neurons can remove the fully connected layer: its parameters are absorbed into a single neuron, which reduces the number of parameters needed for the same mapping capacity. (4) ResDD (ResDD modules + one linear module) can replace the neurons of current ANNs; ResDD has controllable precision for better generalization capability.
Gang neuron code is available at https://github.com/liugang1234567/Gang-neuron.
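As a rough check of point (1), the sketch below chains dendrite modules of the form $W_{i,i-1}A_{i-1} \circ A_{i-1}$ on a scalar input augmented with a constant. Because each stage multiplies a linear map of the previous output by the previous output itself, the polynomial degree doubles at every stage, so $k$ stages reach degree $2^k$ with only $O(k)$ matrix-vector products, whereas Horner's method needs on the order of $2^k$ multiplications for a polynomial of that degree. The weights and the verification below are illustrative assumptions, not results from the paper or the repository.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 3                                  # number of dendrite modules in series
W = rng.standard_normal((k, 2, 2))     # one 2x2 weight matrix per module (random, illustrative)
readout = rng.standard_normal(2)       # final linear (soma) weights

def dendrite_chain(x):
    a = np.array([x, 1.0])             # A_0 = [x, 1], so constant terms are available
    for Wi in W:
        a = (Wi @ a) * a               # W_{i,i-1} A_{i-1} ∘ A_{i-1}: degree doubles each stage
    return readout @ a

xs = np.linspace(-1.0, 1.0, 200)
ys = np.array([dendrite_chain(x) for x in xs])

# The chained output should be exactly a polynomial of degree 2**k in x,
# reached with only O(k) multiplications.
coeffs = np.polyfit(xs, ys, deg=2**k)
residual = np.max(np.abs(np.polyval(coeffs, xs) - ys))
print(f"max residual of a degree-{2**k} fit: {residual:.1e}")  # tiny (numerical noise only)
```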