
Knowledge Distillation-aided End-to-End Learning for Linear Precoding in Multiuser MIMO Downlink Systems with Finite-Rate Feedback

  • 2021-05-06

Document pages: 6

Abstract: We propose a deep learning-based channel estimation, quantization, feedback, and precoding method for downlink multiuser multiple-input multiple-output systems. In the proposed system, channel estimation and quantization for limited feedback are handled by a receiver deep neural network (DNN), while precoder selection is handled by a transmitter DNN. To emulate traditional channel quantization, a binarization layer is adopted at each receiver DNN; this layer also enables end-to-end learning. However, binarization can lead to inaccurate gradients, which can trap the receiver DNNs in a poor local minimum during training. To address this, we consider knowledge distillation, in which the existing DNNs are jointly trained with an auxiliary transmitter DNN. Using the auxiliary DNN as a teacher network allows the receiver DNNs to additionally exploit lossless gradients, which helps them avoid a poor local minimum. For the same number of feedback bits, our DNN-based precoding scheme achieves a higher downlink rate than conventional linear precoding with codebook-based limited feedback.
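The abstract's core training difficulty is the binarization layer: its forward pass emits hard feedback bits, but its true gradient is zero almost everywhere. A common workaround (not spelled out in the abstract) is a straight-through surrogate gradient, and the distillation idea adds a teacher term with lossless gradients. The sketch below, in NumPy, is only an illustration of these mechanisms; the function names, the clipping region, and the `distillation_loss` form are hypothetical, not the paper's implementation.

```python
import numpy as np

def binarize(x):
    """Forward pass of a binarization layer: each latent entry becomes one
    feedback bit (+1 or -1), emulating finite-rate feedback."""
    return np.where(x >= 0.0, 1.0, -1.0)

def binarize_grad(x, upstream_grad):
    """Backward pass with a straight-through estimator (assumed here):
    sign() has zero derivative almost everywhere, so the upstream gradient
    is passed through unchanged inside |x| <= 1 and zeroed outside.
    This surrogate gradient is the 'inaccurate gradient' the abstract's
    knowledge distillation aims to compensate for."""
    return upstream_grad * (np.abs(x) <= 1.0)

def distillation_loss(student_rate, teacher_rate, alpha=0.5):
    """Hypothetical combined objective: the receiver DNNs (student) both
    maximize their own downlink rate and are pulled toward the auxiliary
    transmitter DNN (teacher), whose unquantized latent yields lossless
    gradients."""
    task_term = -student_rate                          # maximize student rate
    distill_term = (student_rate - teacher_rate) ** 2  # match the teacher
    return alpha * task_term + (1.0 - alpha) * distill_term
```

In a full training loop, the student path would use `binarize`/`binarize_grad` while the teacher path skips quantization entirely, so gradients from the distillation term reach the receiver DNNs undistorted.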
