Fig. 6 Framework of the NAM attention mechanism
⑵ RetinaNet model refinement
The NAM attention mechanism is a lightweight, high-performance, plug-and-play
module. It is inserted at the tail of the ResNet-50 backbone and connected to
the head of the feature pyramid network (FPN), the multiscale component of the
detector, forming the attention module shown in yellow in Figure 7. The
improved RetinaNet model is thereby obtained.
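As a sketch of how such a plug-and-play module operates, the channel-attention step of NAM can be written as follows. This is a minimal NumPy illustration based on the published description of NAM (batch-norm scale factors used as channel-importance weights), not the authors' exact implementation; the function name and signature are assumptions for illustration.

```python
import numpy as np

def nam_channel_attention(x, gamma, beta, eps=1e-5):
    """Illustrative NAM-style channel attention (assumed sketch).

    x:     feature map of shape (N, C, H, W)
    gamma: per-channel BN scale factors, shape (C,)
    beta:  per-channel BN shift factors, shape (C,)
    """
    # Batch-normalize each channel over the (N, H, W) axes.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    g = gamma.reshape(1, -1, 1, 1)
    b = beta.reshape(1, -1, 1, 1)
    x_bn = g * (x - mean) / np.sqrt(var + eps) + b

    # Channel weights from the BN scale factors: channels with larger
    # |gamma| (more variance contribution) receive larger weights.
    w = np.abs(gamma) / np.abs(gamma).sum()

    # Sigmoid gating of the weighted normalized features,
    # applied back to the input as a residual re-weighting.
    gate = 1.0 / (1.0 + np.exp(-(x_bn * w.reshape(1, -1, 1, 1))))
    return gate * x
```

Because the module keeps the input shape unchanged, it can be appended to the backbone output (here, the tail of ResNet-50) without modifying the FPN that consumes its features.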