

Research on Key Technologies of Deep Learning Based Millimeter-Wave Beam Management

Author: 马可 (Ke Ma)
  • Student ID
    2019******
  • Degree
    Doctoral
  • Email
    ma-******.cn
  • Defense Date
    2024.05.22
  • Supervisor
    王昭诚 (Zhaocheng Wang)
  • Discipline
    Information and Communication Engineering
  • Pages
    124
  • Confidentiality
    Public
  • Affiliation
    023 Department of Electronic Engineering
  • Keywords
    mmWave communications; beam management; deep learning; non-standalone networks; heterogeneous networks

Abstract


To compensate for the high path loss of millimeter-wave (mmWave) signals, beamforming based on large-scale antenna arrays has been widely adopted to realize directional transmission and improve the quality of mmWave communications. Given the large number of candidate beams, achieving high-gain beam alignment at low overhead has become one of the core problems of mmWave beam management. Conventional beam management methods struggle to accurately model nonlinear features such as the intricate correlations within received signals and the user's movement state, and their performance is limited because they cannot comprehensively characterize the wireless propagation environment around base stations and users. Benefiting from the strong fitting capability and adaptive, data-driven optimization paradigm of deep learning, this dissertation employs deep learning to model the nonlinear and high-dimensional features of beam management in the spatial/temporal/frequency domains, aiming to establish low-overhead, efficient, and intelligent mmWave beam management.

Firstly, to handle the nonlinear features caused by the power leakage effect in the angular domain, a deep learning based angular-domain super-resolution beam prediction method is proposed, which predicts the optimal narrow beam from the received signals of wide-beam training; an adaptive strategy for selecting high-reliability wide training beams is also designed. Simulation results verify that, compared with traditional prediction methods, the proposed method achieves higher beam gains while significantly reducing the beam training overhead.

Secondly, for the temporal nonlinear features in high-speed mobile scenarios, deep learning based adaptive low-overhead beam tracking methods are proposed. On the one hand, to adapt to different movement speeds, a tracking-beam selection strategy based on the sum-probability criterion is proposed, which flexibly adjusts the tracking range according to the prediction confidence and achieves more robust beam alignment than traditional tracking methods. On the other hand, considering that beam misalignment occurs at random instants, a continuous-time beam prediction framework that integrates deep learning with ordinary differential equations is further proposed; simulation results demonstrate its superiority over traditional deep learning models.

Next, to exploit the complex correlations between the angular characteristics of low-frequency and mmWave channels in non-standalone (NSA) networks, a deep learning based low-frequency assisted mmWave beam prediction method is proposed. To handle dynamic input/output sizes, a robust beam prediction model based on convolutional neural networks and unsupervised learning is designed. Furthermore, to cope with the mismatched channel measurement periods of the two bands, a cascaded long short-term memory (LSTM) network with shared parameters is proposed, which predicts the optimal mmWave beam in the form of temporal interpolation. Simulation results validate the performance gains of the proposed models over traditional low-frequency assisted prediction methods.

Finally, for more complex low-frequency/mmWave heterogeneous networks, to exploit the sophisticated inter-base-station channel correlations induced by the shared wireless environment, a dual-band fusion method for joint mmWave base station/beam prediction is proposed: low-frequency channel information is first used to predict user-specific high-quality mmWave wide beams, and an attention mechanism then fuses the dual-band features to predict the optimal mmWave base station and narrow beam. Simulation results show that, under the same measurement overhead, the proposed method achieves higher beam gains than traditional low-frequency assisted prediction methods.
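The super-resolution beam prediction of the first contribution maps a handful of wide-beam training measurements to a score for every candidate narrow beam. Below is a minimal sketch of such a predictor in PyTorch; the 8-wide/64-narrow codebook, the fully connected architecture, and all layer widths are illustrative assumptions, not the dissertation's actual design.

```python
import torch
import torch.nn as nn

# Sketch: received powers from N_WIDE wide training beams are mapped to a
# distribution over N_NARROW candidate narrow beams. Sizes are hypothetical.
N_WIDE, N_NARROW = 8, 64

predictor = nn.Sequential(
    nn.Linear(N_WIDE, 128),
    nn.ReLU(),
    nn.Linear(128, 128),
    nn.ReLU(),
    nn.Linear(128, N_NARROW),   # logits over narrow-beam indices
)

# One batch of wide-beam received-power measurements (e.g., RSRP in dB).
wide_rsrp = torch.randn(32, N_WIDE)
logits = predictor(wide_rsrp)
best_narrow_beam = logits.argmax(dim=-1)   # predicted optimal narrow beam
probs = logits.softmax(dim=-1)             # per-beam prediction confidence
```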
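The sum-probability criterion for tracking-beam selection can be stated compactly: sort the predicted per-beam probabilities and keep the smallest set whose cumulative probability exceeds a threshold, so low-confidence predictions automatically widen the tracking range. A sketch under that reading follows; the 0.9 threshold is a hypothetical value.

```python
import torch

def select_tracking_beams(probs: torch.Tensor, threshold: float = 0.9):
    """Sum-probability tracking-beam selection (sketch): keep the smallest
    set of beams whose predicted probabilities of being optimal sum past
    `threshold`. Uncertain predictions yield a wider tracking range."""
    sorted_probs, order = probs.sort(descending=True)
    cumulative = sorted_probs.cumsum(dim=0)
    # first position where the running sum crosses the threshold
    k = int((cumulative >= threshold).nonzero()[0]) + 1
    return order[:k]

probs = torch.tensor([0.05, 0.55, 0.25, 0.10, 0.05])
print(select_tracking_beams(probs))  # tensor([1, 2, 3]): smallest set >= 0.9
```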
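Coupling deep learning with ordinary differential equations lets the hidden state be advanced to an arbitrary query time rather than to fixed slots, matching the random instants at which misalignment can occur. The sketch below integrates a learned vector field with a fixed-step Euler solver purely for illustration; the dimensions and the solver are assumptions (a practical implementation might use an adaptive solver such as torchdiffeq).

```python
import torch
import torch.nn as nn

class ODEBeamPredictor(nn.Module):
    """Sketch of continuous-time beam prediction in the spirit of neural
    ODEs: a learned vector field evolves a hidden state between irregular
    measurement instants, so beam logits can be queried at any time."""
    def __init__(self, hidden: int = 64, n_beams: int = 64):
        super().__init__()
        self.field = nn.Sequential(nn.Linear(hidden, hidden), nn.Tanh(),
                                   nn.Linear(hidden, hidden))
        self.readout = nn.Linear(hidden, n_beams)

    def forward(self, h: torch.Tensor, dt: float, steps: int = 10):
        step = dt / steps
        for _ in range(steps):          # explicit Euler integration
            h = h + step * self.field(h)
        return h, self.readout(h)       # evolved state, beam logits

model = ODEBeamPredictor()
h0 = torch.zeros(1, 64)                 # state after the last measurement
h1, logits = model(h0, dt=0.37)         # query 0.37 s ahead (arbitrary time)
```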
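For the NSA prediction model that must tolerate dynamic input sizes, one standard convolutional trick is adaptive pooling, so a single set of CNN weights serves different measurement configurations. The sketch below shows only that size-agnostic CNN part (the unsupervised-learning component is omitted); the channel counts, kernel sizes, and real/imaginary feature layout are hypothetical.

```python
import torch
import torch.nn as nn

class DynamicSizeBeamCNN(nn.Module):
    """Sketch of a beam predictor robust to dynamic input sizes: a 1-D CNN
    over however many low-frequency subcarrier features are measured,
    followed by adaptive pooling to a fixed length."""
    def __init__(self, n_beams: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(16),      # fixed length for any input size
        )
        self.head = nn.Linear(32 * 16, n_beams)

    def forward(self, x: torch.Tensor):
        # x: (batch, 2, n_subcarriers) real/imag channel features, where
        # n_subcarriers may vary between deployments
        return self.head(self.conv(x).flatten(1))

model = DynamicSizeBeamCNN()
print(model(torch.randn(2, 2, 48)).shape)   # works for 48 subcarriers...
print(model(torch.randn(2, 2, 96)).shape)   # ...and, unchanged, for 96
```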
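The parameter-shared cascaded LSTM can be read as one recurrent cell reused across every low-band measurement slot, emitting a mmWave beam estimate at each step, including slots with no fresh mmWave measurement, which is the temporal-interpolation behavior described above. A sketch with assumed feature sizes:

```python
import torch
import torch.nn as nn

class SharedLSTMInterpolator(nn.Module):
    """Sketch of low-frequency assisted mmWave beam interpolation with a
    parameter-shared LSTM: the same cell is unrolled over all low-band
    measurements, and a readout emits beam logits at every slot."""
    def __init__(self, lf_dim: int = 32, hidden: int = 64, n_beams: int = 64):
        super().__init__()
        self.cell = nn.LSTMCell(lf_dim, hidden)   # shared across all steps
        self.readout = nn.Linear(hidden, n_beams)

    def forward(self, lf_seq: torch.Tensor):
        # lf_seq: (batch, time, lf_dim) low-frequency channel features
        b, t, _ = lf_seq.shape
        h = torch.zeros(b, self.cell.hidden_size)
        c = torch.zeros(b, self.cell.hidden_size)
        logits = []
        for i in range(t):
            h, c = self.cell(lf_seq[:, i], (h, c))
            logits.append(self.readout(h))        # beam estimate per slot
        return torch.stack(logits, dim=1)         # (batch, time, n_beams)

model = SharedLSTMInterpolator()
beams = model(torch.randn(4, 10, 32)).argmax(-1)  # per-slot best beams
```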
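Finally, the dual-band fusion stage can be sketched with a single attention layer in which low-frequency features query wide-beam features from the candidate mmWave base stations, followed by two heads for the base-station and narrow-beam decisions. Every dimension here, and the use of one multi-head attention layer, is an illustrative assumption rather than the dissertation's architecture.

```python
import torch
import torch.nn as nn

class DualBandFusion(nn.Module):
    """Sketch of attention-based dual-band fusion for joint mmWave base
    station (BS) / narrow-beam prediction: low-frequency features attend
    over per-BS wide-beam features, and two heads score BSs and beams."""
    def __init__(self, dim: int = 64, n_bs: int = 4, n_beams: int = 64):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.bs_head = nn.Linear(dim, n_bs)       # which mmWave BS
        self.beam_head = nn.Linear(dim, n_beams)  # which narrow beam

    def forward(self, lf_feat: torch.Tensor, mmw_feat: torch.Tensor):
        # lf_feat: (batch, 1, dim) low-frequency channel features;
        # mmw_feat: (batch, n_bs, dim) wide-beam features per candidate BS
        fused, _ = self.attn(lf_feat, mmw_feat, mmw_feat)
        fused = fused.squeeze(1)
        return self.bs_head(fused), self.beam_head(fused)

model = DualBandFusion()
bs_logits, beam_logits = model(torch.randn(2, 1, 64), torch.randn(2, 4, 64))
```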