Ships are important maritime transport vehicles and military targets, so monitoring them is essential. In visible-light remote sensing imagery, ship detection is likewise an important research topic. As stealth technology develops, however, direct detection of ships faces growing challenges. Wakes, by contrast, are an inevitable product of ship motion: a wake can extend to several times the ship's length and persist for more than ten minutes. Detecting wakes not only helps locate small or stealth-equipped ships, but the morphological details of a wake can also be used to invert important motion parameters such as the ship's heading and speed. This thesis therefore studies wake detection and classification, as well as ship parameter inversion, on the basis of visible-light remote sensing images.

First, through a literature survey, the thesis clarifies the classification and morphological characteristics of ship wakes and studies the mathematical models used to describe them. To address the shortage of high-quality wake datasets in visible-light remote sensing, a wake database of high-resolution visible images with complex ocean backgrounds and annotated legend information such as scale bars is built from two sources: remote sensing satellite platforms and UAV aerial photography of model ships.

For wake detection, in view of the low accuracy and poor real-time performance of detection algorithms in the literature, a lightweight neural-network wake detection algorithm is developed on the basis of the aerial model-ship database established above. By applying depthwise separable convolutions and the SE (Squeeze-and-Excitation Networks) attention mechanism, the algorithm achieves an average detection precision above 95% and a per-image detection time below 30 ms, while the model occupies less than 50 MB of storage, meeting the lightweight design goal and laying the groundwork for future hardware deployment on airborne platforms.

For wake classification, a simulation platform is built from the mathematical wake model to study how ship speed and ship length affect wake morphology. Combining the simulation conclusions with statistics from the database, a three-class criterion for wakes is established: wakes of fast small ships, fast large ships, and slow large ships. Based on the detection model above, the new classification task is then completed through transfer learning; on a test set of 65 images, the average classification precision for all three wake types exceeds 90%.

Finally, for inverting ship motion parameters from wake features, the thesis develops a turbulent-wake detection and heading inversion algorithm based on pixel analysis, and a divergent-wave detection and speed inversion algorithm based on analysis of the Kelvin wake structure. In the test experiments, the average heading inversion error is 1.99% and the average speed inversion error is 2.42%.
Ships are important maritime transport vehicles and military targets, so their detection and monitoring is essential. In the field of visible-light remote sensing imagery, ship detection is also a crucial research topic. However, with the development of stealth technology, direct ship detection faces increasing challenges. Wakes, as an inevitable product of ship motion, can be several times the length of the ship and persist for more than ten minutes. Wake detection therefore not only assists in locating small targets or ships equipped with stealth technology, but the detailed wake morphology can also be used to infer important motion parameters such as the ship's heading and speed. Accordingly, this thesis conducts research on wake detection and classification as well as ship parameter inversion based on visible-light remote sensing images.

Firstly, through a literature survey, this thesis clarifies the classification and morphological characteristics of ship wakes and studies the mathematical models for wake modeling. Meanwhile, to address the shortage of high-quality wake datasets in the visible remote sensing field, this thesis constructs a high-resolution visible-image wake database that contains complex ocean backgrounds and annotated legend information such as scale bars, with samples collected from both remote sensing satellite platforms and UAV aerial photography of model ships.

In terms of wake detection, in view of the low accuracy and insufficient real-time performance of detection algorithms in the literature, this thesis develops a neural-network-based lightweight wake detection algorithm, building on the aerial model-ship database established above. Through the application of depthwise separable convolution and the SE (Squeeze-and-Excitation) attention mechanism, the algorithm achieves an average detection precision of more than 90% and a real-time target of less than 30 ms per image. At the same time, the model occupies less than 50 MB of storage, meeting the lightweight design goal and establishing a basis for future hardware deployment on airborne platforms.

In terms of wake classification, this thesis builds a simulation platform based on the mathematical model of wakes and studies the influence of ship speed and ship length on wake morphology. Combining the simulation results with statistics from the database, a three-class criterion for wakes is established: fast small-ship wakes, fast large-ship wakes, and slow large-ship wakes. Finally, based on the wake detection model established earlier, the new classification task is completed through transfer learning; on a test set of 65 images, the average classification precision for all three wake types exceeds 90%.

Lastly, for inverting ship motion parameters from wake features, this thesis develops a turbulent-wake detection and heading inversion algorithm based on pixel analysis, and a divergent-wave detection and speed inversion algorithm based on Kelvin wake structure analysis. In the test experiments, the average errors of heading inversion and speed inversion are 1.99% and 2.42%, respectively.
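The two building blocks named for the lightweight detector are depthwise separable convolution and the SE attention mechanism. The following is only a minimal sketch of those two modules, assuming PyTorch; the channel counts and reduction ratio are illustrative and do not reproduce the thesis's actual network.

```python
# Minimal sketch (assumption: PyTorch; layer sizes are illustrative only) of the
# two components named in the abstract: a depthwise separable convolution and an
# SE (Squeeze-and-Excitation) attention block.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: reweight channels using global context."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)            # squeeze: global average pooling
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),                              # excitation: per-channel weights in (0, 1)
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w                                   # rescale each channel

class DepthwiseSeparableConv(nn.Module):
    """3x3 depthwise conv followed by 1x1 pointwise conv (far fewer parameters than a full 3x3 conv)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, padding=1, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

x = torch.randn(1, 32, 64, 64)
y = SEBlock(32)(DepthwiseSeparableConv(32, 32)(x))
print(y.shape)  # torch.Size([1, 32, 64, 64])
```

Stacking such blocks in place of standard convolutions is what keeps the parameter count, and hence the model's storage footprint, small.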
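For the three-class wake classification obtained by transfer learning, a common pattern is to freeze a pretrained backbone and retrain only a small classification head. The sketch below assumes PyTorch/torchvision and uses MobileNetV3-Small purely as a stand-in lightweight backbone; the thesis's own detection network is not reproduced here.

```python
# Hedged sketch of the transfer-learning step: freeze pretrained features and
# retrain a 3-class head (fast small-ship / fast large-ship / slow large-ship wakes).
# The backbone is a placeholder, not the thesis's actual detector.
import torch
import torch.nn as nn
from torchvision import models

backbone = models.mobilenet_v3_small(weights="DEFAULT")   # stand-in lightweight backbone
for p in backbone.parameters():
    p.requires_grad = False                               # freeze the pretrained feature extractor

in_feats = backbone.classifier[-1].in_features
backbone.classifier[-1] = nn.Linear(in_feats, 3)          # new 3-class wake-type head

# Only the new head is optimized; suitable for a small labeled set such as the 65-image test split.
optimizer = torch.optim.Adam(backbone.classifier[-1].parameters(), lr=1e-3)
```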
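The abstract describes heading inversion via pixel analysis of the turbulent wake but does not spell out the algorithm. As one illustrative possibility (not the thesis's method), the dominant axis of a segmented wake mask can be estimated with PCA over the wake-pixel coordinates; converting that image-plane angle to a geographic heading would additionally require the image's geo-referencing.

```python
# Hedged, illustrative heading estimate from a binary turbulent-wake mask
# (assumed input: nonzero pixels mark the wake). Not the thesis's exact algorithm.
import numpy as np

def wake_heading_deg(mask: np.ndarray) -> float:
    """Orientation (degrees from the image x-axis, in [0, 180)) of the wake's principal axis."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    pts -= pts.mean(axis=0)                       # center the pixel cloud
    cov = np.cov(pts, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # principal axes of the pixel distribution
    vx, vy = eigvecs[:, np.argmax(eigvals)]       # direction of largest variance
    return float(np.degrees(np.arctan2(vy, vx)) % 180.0)

# Example: a synthetic diagonal streak should come out near 45 degrees.
m = np.zeros((100, 100), dtype=np.uint8)
for i in range(100):
    m[i, max(0, i - 1):i + 2] = 1
print(wake_heading_deg(m))  # ~45.0
```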
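Speed inversion from the Kelvin wake structure typically rests on the deep-water dispersion relation: waves that are stationary in the ship's frame and propagate at angle θ to the track have wavelength λ(θ) = 2πU²cos²θ / g. The sketch below only illustrates that standard relation under the deep-water assumption; it is not the thesis's implementation, and the example numbers are arbitrary.

```python
# Hedged sketch: invert ship speed U from a wavelength measured in the Kelvin wake,
# using lambda = 2*pi*U^2*cos^2(theta)/g (deep-water gravity waves, stationary in the ship frame).
import math

G = 9.81  # gravitational acceleration, m/s^2

def speed_from_wavelength(wavelength_m: float, theta_deg: float = 0.0) -> float:
    """Ship speed U in m/s from a measured wavelength at propagation angle theta.

    theta_deg is measured relative to the ship heading: 0 deg for transverse waves,
    about 35.26 deg for divergent waves sampled along the cusp line.
    """
    cos_t = math.cos(math.radians(theta_deg))
    return math.sqrt(G * wavelength_m / (2.0 * math.pi)) / cos_t

# Example: a 20 m transverse wavelength corresponds to roughly 5.6 m/s (~10.9 kn).
u = speed_from_wavelength(20.0)
print(f"{u:.2f} m/s = {u * 1.9438:.1f} kn")
```

In practice the wavelength and angle are read off the detected divergent-wave pattern in the image, using the annotated scale information to convert pixels to metres.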