Sharing Weights in Shallow Layers via Rotation Group Equivariant Convolutions

Based on rotation group equivariant convolution (RGEC), the team of Prof. Huiguang He from the Institute of Automation, Chinese Academy of Sciences has designed novel networks that share kernel weights across different orientations in the shallow layers. Experimental results show that this approach requires far fewer kernels and parameters in the shallow layers while maintaining superior performance. Compared with a conventional CNN, it keeps the same number of output channels and adds no computational burden, indicating that shallow-layer convolution kernels benefit substantially from rotation symmetry via RGEC. Fewer kernels are also more intuitive and interpretable, reducing potential generalization risks.
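
For intuition, here is a minimal PyTorch sketch of such a layer (an illustration, not the authors' implementation; the class name C4SharedConv2d and all sizes are invented for the example). It shares each base kernel across the four 90° orientations of the C4 rotation group:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class C4SharedConv2d(nn.Module):
        # Shares each base kernel across the four 90-degree rotations (C4).
        # With the output channel budget held fixed, only out_channels // 4
        # distinct kernels are stored, so the layer holds roughly 1/4 of the
        # parameters of an ordinary Conv2d with the same output width.
        def __init__(self, in_channels, out_channels, kernel_size=3, padding=1):
            super().__init__()
            assert out_channels % 4 == 0, "channels must split across 4 orientations"
            self.padding = padding
            self.weight = nn.Parameter(
                torch.randn(out_channels // 4, in_channels,
                            kernel_size, kernel_size) * 0.05)
            self.bias = nn.Parameter(torch.zeros(out_channels))

        def forward(self, x):
            # Build the full kernel bank by rotating the base kernels by
            # 0/90/180/270 degrees; orientation is the outer channel dimension.
            rotated = [torch.rot90(self.weight, k, dims=(2, 3)) for k in range(4)]
            weight = torch.cat(rotated, dim=0)  # (out_channels, in_channels, k, k)
            return F.conv2d(x, weight, self.bias, padding=self.padding)

Because the rotated copies are generated on the fly from a single base kernel, the layer stores about a quarter of the weights of an ordinary convolution with the same number of output channels.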



Convolutional neural networks (CNNs) have developed rapidly over the past two decades, achieving excellent performance in a wide range of computer vision tasks, including classification, detection, and semantic segmentation.


CNNs are equivariant to the translation group: they share weights across spatial positions. Compared with fully connected networks, this weight sharing gives CNNs significantly higher parameter efficiency and greater robustness to translation, and thus better performance. Inspired by this, the paper aims to achieve better performance with an even higher degree of parameter efficiency.


To that end, this paper explores a novel method that takes greater advantage of rotation group equivariant convolution (RGEC). In addition to improving the efficiency of the network parameters, the method ensures that no additional computational burden is introduced.


Specifically, this paper keeps the output channels of the RGEC the same as those of an ordinary convolution, rather than increasing the output channels to keep the parameter count comparable. Consequently, no extra computing resources are required, while far fewer kernels and parameters are needed.
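
The saving can be checked directly by counting weights. Assuming the C4SharedConv2d sketch above and a four-fold rotation group, with the output channel budget held fixed:

    import torch.nn as nn

    ordinary = nn.Conv2d(16, 64, kernel_size=3, padding=1)
    shared = C4SharedConv2d(16, 64, kernel_size=3, padding=1)   # sketch above

    n_ordinary = sum(p.numel() for p in ordinary.parameters())  # 64*16*9 + 64 = 9280
    n_shared = sum(p.numel() for p in shared.parameters())      # 16*16*9 + 64 = 2368
    print(n_ordinary, n_shared)

    # Both layers emit 64 channels, so the downstream layers and this layer's
    # FLOPs are unchanged; only the number of stored kernels drops (about 4x).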


To avoid introducing extra computational cost, this paper constructs an arbitrary-shape convolution that can rotate the convolution kernels directly and conveniently.
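
One plausible way to realize such an arbitrary-shape convolution (sketched here as an assumption, not necessarily the paper's exact construction) is to describe a kernel by its integer sample offsets, so that rotating the kernel reduces to an exact index transform that works for any support shape:

    import torch

    def rotate_offsets(offsets, k):
        # Rotate integer (dy, dx) kernel offsets by k * 90 degrees about the
        # origin (clockwise in image coordinates, with y pointing down).
        # Since the kernel is a set of offsets rather than a dense grid,
        # supports of any shape are rotated exactly, with no interpolation.
        dy, dx = offsets[:, 0], offsets[:, 1]
        for _ in range(k % 4):
            dy, dx = dx, -dy
        return torch.stack([dy, dx], dim=1)

    # An L-shaped (non-square) support; rotating maps it to the rotated L.
    offsets = torch.tensor([[0, 0], [0, 1], [0, 2], [1, 0]])
    print(rotate_offsets(offsets, 1))  # tensor([[0, 0], [1, 0], [2, 0], [0, -1]])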


Considering that RGEC is most beneficial for low-level features, this paper constructs networks that share weights across different orientations only in the shallow layers.
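
A hypothetical backbone illustrating this design (layer sizes are invented for the example, and it reuses the C4SharedConv2d sketch above): orientation weight sharing in the shallow stage only, ordinary convolutions afterwards:

    import torch.nn as nn

    class ShallowSharedNet(nn.Module):
        # Rotation weight sharing only where features are low-level and
        # geometric (edges, corners); ordinary convolutions in deeper layers,
        # where features become more semantic and less rotation-symmetric.
        def __init__(self, num_classes=10):
            super().__init__()
            self.features = nn.Sequential(
                C4SharedConv2d(3, 64, kernel_size=3, padding=1),  # shared orientations
                nn.ReLU(inplace=True),
                nn.MaxPool2d(2),
                nn.Conv2d(64, 128, 3, padding=1),                 # ordinary from here on
                nn.ReLU(inplace=True),
                nn.AdaptiveAvgPool2d(1),
            )
            self.classifier = nn.Linear(128, num_classes)

        def forward(self, x):
            return self.classifier(self.features(x).flatten(1))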


On top of the reduced parameters and unchanged computational cost, a non-maximum-suppression loss along the orientation dimension is designed and added to further improve performance.
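
The article does not spell this loss out; one plausible reading, sketched under the assumption that the orientations produced by one shared kernel lie along the outer channel dimension (as in the sketch above), penalizes all but the strongest orientation response at each location:

    import torch

    def orientation_nms_loss(feat, n_orient=4):
        # Hypothetical NMS-style penalty over the orientation dimension.
        # Assumes feat has shape (N, C, H, W) with C = n_orient * groups and
        # orientation as the outer channel dimension.
        n, c, h, w = feat.shape
        g = feat.view(n, n_orient, c // n_orient, h, w).abs()
        max_resp, _ = g.max(dim=1, keepdim=True)
        # Leave only the per-location maximum orientation unpenalized, pushing
        # each kernel to respond with one dominant orientation per position.
        return (g.sum(dim=1, keepdim=True) - max_resp).mean()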


Extensive experiments demonstrate that sharing weights across different orientations in the shallow layers, used as a drop-in replacement for conventional convolutions, improves performance with fewer parameters and no extra computation.
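
Since the shared layer keeps the interface of an ordinary convolution, swapping it in is a one-line change; an illustration with the sketch above:

    import torch

    layer = C4SharedConv2d(3, 64, kernel_size=3, padding=1)  # was: nn.Conv2d(3, 64, 3, padding=1)
    out = layer(torch.randn(2, 3, 32, 32))
    print(out.shape)  # torch.Size([2, 64, 32, 32]) -- same shape as the ordinary layer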


Download full text

Sharing Weights in Shallow Layers via Rotation Group Equivariant Convolutions

Zhiqiang Chen, Ting-Bing Xu, Jinpeng Li, Huiguang He

https://link.springer.com/article/10.1007/s11633-022-1324-5

https://www.mi-research.net/en/article/doi/10.1007/s11633-022-1324-5

Release Date: 2022-05-10