Description

In conventional radar signal processing, a structured model is used for the target response, while clutter and interference are characterized by the covariance matrix of the data distribution. In contrast, the channel matrix-based model represents both target and clutter returns as responses to corresponding channels, yielding a more versatile model that can accommodate a variety of scenarios. Optimal receive architectures for target detection within the channel matrix-based model are explored using likelihood ratio tests (LRT) and average LRT (ALRT) tests. Generalized likelihood ratio test (GLRT) statistics are derived for the channel matrix-based MIMO radar data model under the assumption of a complex multivariate elliptically symmetric (CMES) data distribution, considering both known and unknown covariance matrices of the waveform-independent colored noise (WICN). For the known-covariance case, the GLRT statistic follows a chi-square distribution, while for the unknown-covariance case it follows Wilks' lambda distribution. When the maximum likelihood estimate of the covariance matrix replaces the true covariance matrix, the GLRT statistic for the known-WICN-covariance case matches the Bartlett-Nanda-Pillai trace statistic under the null hypothesis and follows a non-central Lawley-Hotelling $T_0^2$ distribution under the alternative hypothesis. Asymptotically, all derived statistics converge to the known-covariance case.

Monte Carlo simulations and the saddle point approximation method are employed to generate receiver operating characteristic (ROC) curves for a simple numerical example, supplemented by experimental results and high-fidelity simulations. The potential of deep learning techniques for radar target detection is also investigated, with a proposed deep neural network (DNN) architecture that benefits from both model-based and data-driven approaches.

The asymptotic distribution of the GLRT statistic for adaptive target detection is non-central chi-squared, with a non-centrality parameter that depends on the waveform information. This provides a basis for the design of optimal waveforms for target detection. The waveform optimization problem is formulated as a semidefinite programming instance, and an algorithm is proposed to maximize the non-centrality parameter, thereby enhancing the probability of target detection. The algorithm also incorporates power and peak-to-average power ratio (PAPR) constraints, which are essential for practical and efficient radar operation.
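As a rough illustration of the known-covariance case described above, the following sketch simulates a GLRT-style detector in pre-whitened complex Gaussian noise and traces an empirical ROC curve by Monte Carlo. The snapshot length, channel vector, SNR, and trial count are illustrative assumptions, not values from the dissertation, and a rank-one channel vector stands in for the full channel matrix-based model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the dissertation): length-N snapshots with
# pre-whitened noise (known WICN covariance absorbed), single target channel a.
N, trials, snr = 8, 50_000, 4.0
a = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

def cn_noise(shape):
    # Circularly symmetric complex Gaussian noise, unit variance per sample.
    return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

def glrt_known_cov(Y):
    # Known-covariance GLRT statistic (whitened matched-filter energy):
    # chi-square with 2 degrees of freedom under H0, non-central chi-square
    # under H1, consistent with the known-covariance case in the abstract.
    return 2.0 * np.abs(Y @ a.conj()) ** 2 / np.real(a.conj() @ a)

t0 = glrt_known_cov(cn_noise((trials, N)))                     # H0: noise only
t1 = glrt_known_cov(np.sqrt(snr) * a + cn_noise((trials, N)))  # H1: target + noise

# Empirical ROC: sweep a common threshold over both sets of statistics.
thr = np.linspace(0.0, t1.max(), 200)
pfa = (t0[None, :] > thr[:, None]).mean(axis=1)
pd = (t1[None, :] > thr[:, None]).mean(axis=1)
print(f"Pd at Pfa ~ 1e-2: {pd[np.argmin(np.abs(pfa - 1e-2))]:.3f}")
```

For the unknown-covariance and adaptive cases, the same Monte Carlo loop applies with the appropriate statistic substituted; the dissertation's saddle point approximation offers a faster alternative to brute-force threshold sweeps at very low false-alarm rates.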
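The waveform design step can likewise be sketched. Assuming, as the abstract indicates, that the non-centrality parameter is a quadratic form $s^H Q s$ in the transmit waveform $s$, a standard semidefinite relaxation lifts $s$ to $X = s s^H$ and drops the rank-one constraint. The kernel Q, power budget, PAPR limit, and the eigenvector rounding below are illustrative stand-ins, not the thesis's algorithm.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)

# Illustrative stand-ins (not from the dissertation): Hermitian PSD kernel Q
# so that the non-centrality parameter is s^H Q s for a waveform s of length L.
L = 16
B = rng.standard_normal((L, L)) + 1j * rng.standard_normal((L, L))
Q = B @ B.conj().T
P_total = float(L)   # total transmit power budget
papr = 2.0           # peak-to-average power ratio limit

# SDP relaxation: optimize over X = s s^H with the rank-one constraint dropped.
X = cp.Variable((L, L), hermitian=True)
constraints = [
    X >> 0,
    cp.real(cp.trace(X)) <= P_total,              # power constraint
    cp.real(cp.diag(X)) <= papr * P_total / L,    # per-sample PAPR constraint
]
prob = cp.Problem(cp.Maximize(cp.real(cp.trace(Q @ X))), constraints)
prob.solve(solver=cp.SCS)

# Rank-one rounding: principal eigenvector of the optimal X as the waveform.
w, V = np.linalg.eigh(X.value)
s = np.sqrt(max(w[-1], 0.0)) * V[:, -1]
print("non-centrality surrogate s^H Q s =", np.real(s.conj() @ Q @ s))
```

The PAPR constraint enters linearly here because diag(X) recovers the per-sample powers $|s_k|^2$, so bounding each by papr times the average power $P_{total}/L$ enforces the ratio limit within the lifted variable.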

    Details

    Title
    • Channel Matrix-based Cognitive Framework for Adaptive Radar
    Date Created
    • 2024
    Resource Type
    • Text
    Note
    • Partial requirement for: Ph.D., Arizona State University, 2024
    • Field of study: Electrical Engineering
