Attention-based deep learning approach for CSI feedback under 5G TDL channel
Journal of Physics: Conference Series, Pub Date: 2024-02-01, DOI: 10.1088/1742-6596/2711/1/012001
Hanli Peng

In 5G communication systems, accurate channel state information (CSI) is indispensable for signal detection and regulation at the base station. However, frequent CSI feedback from users incurs excessive system overhead. To tackle this challenge, this paper proposes HCNet, a novel deep learning framework based on an attention mechanism and an autoencoder, which compresses and reconstructs high-dimensional time-varying CSI matrices in an end-to-end manner. The encoder incorporates a self-attention module to explicitly capture global dependencies within the CSI matrix, while the decoder adopts gated recurrent unit (GRU) networks to fully exploit inter-feature correlations and redundancy. Performance is evaluated through simulations on datasets conforming to current 5G time-varying TDL channel models. Results demonstrate superior performance over existing deep-learning-based feedback networks: the proposed framework reduces the normalized mean square error (NMSE) of CSI reconstruction by 4 dB under various compression ratios, confirming the effectiveness of the attention-enhanced autoencoder structure for compressive CSI sensing and feedback in practical dynamic communication systems.
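The abstract describes the architecture only at a high level: a self-attention module in the encoder, GRU layers in the decoder, and end-to-end compression and reconstruction of the CSI matrix. As a rough illustration of how such an attention-plus-GRU autoencoder might be wired, here is a minimal PyTorch sketch. The class name HCNetSketch, all dimensions (32 antennas, 32 subcarriers, a 128-dimensional codeword, i.e. a 1/16 compression ratio), and the nmse_db helper are assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn as nn

class HCNetSketch(nn.Module):
    """Hypothetical sketch of an attention-based CSI feedback autoencoder.

    The paper only states that the encoder uses self-attention and the
    decoder uses GRUs; every layer size and the overall wiring here are
    assumptions made for illustration.
    """

    def __init__(self, n_ant=32, n_sub=32, compressed_dim=128):
        super().__init__()
        self.n_sub = n_sub
        d_model = 2 * n_ant  # real and imaginary parts stacked per subcarrier

        # Encoder: self-attention over the subcarrier axis captures global
        # dependencies; a linear layer projects to the feedback codeword.
        self.attn = nn.MultiheadAttention(embed_dim=d_model, num_heads=4,
                                          batch_first=True)
        self.enc_fc = nn.Linear(n_sub * d_model, compressed_dim)

        # Decoder: expand the codeword, then refine with a stacked GRU that
        # exploits correlations and redundancy across subcarriers.
        self.dec_fc = nn.Linear(compressed_dim, n_sub * d_model)
        self.gru = nn.GRU(input_size=d_model, hidden_size=d_model,
                          num_layers=2, batch_first=True)

    def forward(self, h):
        # h: (batch, n_sub, 2 * n_ant), a real-valued view of the CSI matrix
        attn_out, _ = self.attn(h, h, h)            # global dependencies
        code = self.enc_fc(attn_out.flatten(1))     # compressed feedback
        x = self.dec_fc(code).view(-1, self.n_sub, h.size(-1))
        out, _ = self.gru(x)                        # refined reconstruction
        return out

def nmse_db(h_true, h_hat):
    """Normalized mean square error in dB, the metric reported in the paper."""
    err = (h_true - h_hat).flatten(1).pow(2).sum(dim=1)
    ref = h_true.flatten(1).pow(2).sum(dim=1)
    return 10 * torch.log10((err / ref).mean())

# Toy usage: an untrained model, so the NMSE will be poor; training would
# minimize the reconstruction MSE end-to-end over TDL channel realizations.
h = torch.randn(8, 32, 64)
model = HCNetSketch()
print(nmse_db(h, model(h)).item())
```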

Updated: 2024-02-01