DOA estimation based on variational sparse Bayesian learning

Applied Science and Technology, 2018, Vol. 45, Issue (6): 32-36. DOI: 10.11991/yykj.201712017

### Cite this article

GAO Lipeng, DU Xuhua. Direction-of-arrival (DOA) estimation based on variational sparse Bayesian learning[J]. Applied Science and Technology, 2018, 45(6): 32-36. DOI: 10.11991/yykj.201712017.


Direction-of-arrival (DOA) estimation based on variational sparse Bayesian learning
GAO Lipeng, DU Xuhua
College of Information and Communication Engineering, Harbin Engineering University, Harbin 150001, China
Abstract: To address the high complexity and slow convergence of the traditional sparse Bayesian learning (SBL) algorithm, this paper proposes a direction-of-arrival (DOA) estimation algorithm based on variational sparse Bayesian learning (VSBL). First, a DOA estimation signal model based on sparse representation is established by gridding the spatial domain. Second, prior distributions are specified for the unknown parameters of this model, yielding the posterior probability distribution of the sparse signal. Variational Bayesian learning is then applied to find an approximation to this posterior distribution by minimizing the KL divergence. Finally, the unknown parameters are estimated and the DOA estimates of the signals are obtained. MATLAB simulation results show that the algorithm successfully estimates signal DOAs, achieving the expected results. Compared with the traditional sparse Bayesian learning algorithm, the proposed algorithm attains higher DOA estimation accuracy and faster convergence under a single snapshot.
Keywords: DOA estimation    Bayesian learning    variational Bayesian learning    sparse representation    relevance vector machine    MATLAB simulation    estimation accuracy    convergence speed

1 DOA estimation model based on sparse representation

 ${{y}}(t) = {{As}}(t) + {{n}}(t)$

 ${{x}}(t) = {[{x_1}(t),{x_2}(t), \cdots ,{x_N}(t)]^{\rm{T}}}$

Only $K$ elements of ${{x}}(t)$ are nonzero, corresponding to the signals $s(t)$ actually present in space; the remaining $N - K$ elements are zero. The $M \times N$ array manifold matrix ${{\varPhi}}$ corresponding to the sparsified spatial signal vector is then

 ${{\varPhi}} {\rm{ = }}[{{a}}({\theta _1}),{{a}}({\theta _2}), \cdots ,{{a}}({\theta _N})]$

 ${{y}}(t) = {{\varPhi}} {{x}}(t) + {{n}}(t)$ (1)
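Model (1) can be made concrete with a short Python (NumPy) sketch. The half-wavelength uniform linear array, the 1° grid, and the source positions (grid indices 60 and 120) are assumptions chosen here for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 8, 181, 2                      # array elements, grid points, sources

# 1-degree grid over [-90, 90]; steering vectors for a half-wavelength ULA
theta = np.deg2rad(np.linspace(-90, 90, N))
m = np.arange(M)[:, None]
Phi = np.exp(-1j * np.pi * m * np.sin(theta)[None, :])   # M x N manifold matrix

# K-sparse spatial vector x: nonzero only at the (hypothetical) true DOAs
x = np.zeros(N, dtype=complex)
x[[60, 120]] = rng.standard_normal(K) + 1j * rng.standard_normal(K)

# Single-snapshot measurement, model (1): y = Phi x + n
n = 0.01 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))
y = Phi @ x + n
```

Only $K = 2$ of the $N = 181$ entries of $x$ are nonzero; recovering their positions recovers the DOAs.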

 $\left\{ {\begin{array}{*{20}{l}}{\hat { x} = \arg \min {{\left\| { x} \right\|}_1}}\\{{\rm s.t.}\;{{\left\| {{ y} - {{Ax}}} \right\|}_2} \leqslant \sigma }\end{array}} \right.$

 $\left( {\begin{array}{*{20}{c}} {{\rm Re}({{y}})} \\ {{\rm Im}({{y}})}\end{array}} \right) = \left( {\begin{array}{*{20}{c}} {{\rm Re}({{\varPhi}} )}&{ - {\rm Im}({{\varPhi}} )} \\ {{\rm Im}({{\varPhi}} )}&{{\rm Re}({{\varPhi}} )} \end{array}} \right)\left( {\begin{array}{*{20}{c}} {{\rm Re}({{x}})} \\ {{\rm Im}({{x}})} \end{array}} \right) + \left( {\begin{array}{*{20}{c}} {{\rm Re}({{n}})} \\ {{\rm Im}({{n}})} \end{array}} \right)$
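The real-valued stacking above can be checked numerically; a minimal Python sketch (the helper name `to_real` is introduced here for illustration):

```python
import numpy as np

def to_real(y, Phi):
    """Stack real and imaginary parts so the complex model y = Phi x + n
    becomes an equivalent real-valued linear model."""
    y_r = np.concatenate([y.real, y.imag])
    Phi_r = np.block([[Phi.real, -Phi.imag],
                      [Phi.imag,  Phi.real]])
    return y_r, Phi_r

# Sanity check: Phi_r @ [Re(x); Im(x)] reproduces [Re(Phi x); Im(Phi x)]
rng = np.random.default_rng(1)
Phi = rng.standard_normal((4, 6)) + 1j * rng.standard_normal((4, 6))
x = rng.standard_normal(6) + 1j * rng.standard_normal(6)
y_r, Phi_r = to_real(Phi @ x, Phi)
x_r = np.concatenate([x.real, x.imag])
assert np.allclose(Phi_r @ x_r, y_r)
```

This works because ${\rm Re}(\varPhi x) = {\rm Re}(\varPhi){\rm Re}(x) - {\rm Im}(\varPhi){\rm Im}(x)$ and ${\rm Im}(\varPhi x) = {\rm Im}(\varPhi){\rm Re}(x) + {\rm Re}(\varPhi){\rm Im}(x)$.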

2 DOA estimation algorithm based on variational sparse Bayesian learning

2.1 Sparse Bayesian model

 ${{y}} = {{\varPhi}} {{x}} + {{n}}$

 $p({{y}}\left| {{x}} \right.,{\sigma ^2}) = \prod\limits_{n = 1}^N {p({{{y}}_n}\left| {{{{\varPhi}} _n}{{x}}} \right.,{\sigma ^2})} = {(2{\text π}{\sigma ^2})^{ - \frac{N}{2}}}\exp \left( - \frac{1}{{2{\sigma ^2}}}{{\left\| {{{y}} - {{\varPhi}} {{x}}} \right\|}^2}\right)$

 $p({{x}}\left| {{\alpha}} \right.) = \prod\limits_{i = 1}^N {N({x_i}\left| 0 \right.,{\alpha _i}^{ - 1})}$

 $p({{\alpha}} ) = \prod\limits_{i = 1}^N {\varGamma ({\alpha _i}\left| a \right.,b)}$
 $p({\alpha _0}) = \varGamma ({\alpha _0}\left| c \right.,d)$

 $\varGamma (\xi \left| {a,b} \right.) = \frac{{{b^a}}}{{\varGamma (a)}}{\xi ^{a - 1}}\exp ( - b\xi )$
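The point estimates used later ($\alpha_m = \overline a_m/\overline b_m$, $\alpha_0 = \overline c/\overline d$) are the means of Gamma distributions. A quick numerical sanity check of the density above, with toy shape/rate values ($a=3$, $b=2$, chosen only for illustration), confirms unit mass and mean $a/b$:

```python
import math

def gamma_pdf(xi, a, b):
    """Gamma(xi | a, b) with shape a and rate b, as defined in the text."""
    return b**a / math.gamma(a) * xi**(a - 1) * math.exp(-b * xi)

# Riemann-sum check: total mass should be 1, mean should be a/b = 1.5
a, b, dx = 3.0, 2.0, 1e-3
xs = [i * dx for i in range(1, 20001)]
mass = sum(gamma_pdf(x, a, b) for x in xs) * dx
mean = sum(x * gamma_pdf(x, a, b) for x in xs) * dx
```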

 $p({{x}},{{\alpha}} ,{\alpha _0}|{{y}}) = \frac{{p({{y}}|{{x}},{{\alpha}} ,{\alpha _0})p({{x}},{{\alpha}} ,{\alpha _0})}}{{p({{y}})}}$
 $p({{y}}) = \iiint {p({{y}}|{{x}},{{\alpha}} ,{\alpha _0})}p({{x}},{{\alpha}} ,{\alpha _0}){\rm{d}}{{x}}{\rm{d}}{{\alpha}} {\rm{d}}{\alpha _0}$

Computing $p(y)$ generally involves a high-dimensional, complicated integral and is intractable. Under the sparse Bayesian model, the maximum a posteriori distribution of the parameters to be estimated is therefore obtained through the decomposition

 $p({{x}},{{\alpha}} ,{\alpha _0}|{{y}}) = p({{x}}|{{y}},{{\alpha}} ,{\alpha _0})p({{\alpha}} ,{\alpha _0}|{{y}})$

2.2 Variational sparse Bayesian estimation

 ${{\vartheta}} = \left\{ {{{{\vartheta}} _1},{{{\vartheta}} _2},{\vartheta _3}} \right\} = \left\{ {{{x}},{{\alpha}} ,{\alpha _0}} \right\}$

 $p({{y}}){\rm{ = }}\frac{{p({{y}},{{\vartheta}} )}}{{p({{\vartheta}} |{{y}})}}$ (2)

 $p({{y}}){\rm{ = }}{{\frac{{p({{y}},{{\vartheta}} )}}{{q({{\vartheta}} )}}} / {\frac{{p({{{{\vartheta}}}} |{{y}})}}{{q({{\vartheta}} )}}}}$ (3)

 $\ln (p({{y}})) = \ln \frac{{p({{y}},{{\vartheta}} )}}{{q({{\vartheta}} )}} - \ln \frac{{p({{\vartheta}} |{{y}})}}{{q({{\vartheta}} )}}$ (4)

Since $\iiint {q({{\vartheta}} ){\rm{d}}{{{\vartheta}} _1}{\rm{d}}{{{\vartheta}} _2}{\rm{d}}{\vartheta _3}} = 1$, equation (4) can be rewritten as

 $\begin{gathered} \ln (p({{y}})) = \iiint {q({{\vartheta}} )\ln \frac{{p(y,{{\vartheta}} )}}{{q({{\vartheta}} )}}}{\rm{d}}{{{\vartheta}} _1}{\rm{d}}{{{\vartheta}} _2}{\rm{d}}{\vartheta _3} - \hfill \\ \quad\quad\quad\quad{\rm{ }}\iiint {q({{\vartheta}} )\ln \frac{{p({{\vartheta}} |{{y}})}}{{q({{\vartheta}} )}}}{\rm{d}}{{{\vartheta}} _1}{\rm{d}}{{{\vartheta}} _2}{\rm{d}}{\vartheta _3} \hfill \\ \end{gathered}$ (5)

 $\ln (p({{y}})){\rm{ = }}L(q({{\vartheta}} )) + {\rm{KL}}(q({{\vartheta}} )||p({{\vartheta}} |{{y}}))$
 $L(q({{\vartheta}} )){\rm{ = }}\iiint {q({{\vartheta}} )\ln \frac{{p({{y}},{{\vartheta}} )}}{{q({{\vartheta}} )}}}{\rm d}{{{\vartheta}} _1}{\rm d}{{{\vartheta}} _2}{\rm d}{\vartheta _3}$
 ${\rm{KL}}(q({{\vartheta}} )||p({{\vartheta}} |{{y}})){\rm{ = }} - \iiint {q({{\vartheta}} )\ln \frac{{p({{\vartheta}} |{{y}})}}{{q({{\vartheta}} )}}}{\rm{d}}{{{\vartheta}} _1}{\rm{d}}{{{\vartheta}} _2}{\rm{d}}{\vartheta _3}$
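The decomposition $\ln p({{y}}) = L(q) + {\rm KL}(q\,\|\,p({{\vartheta}}|{{y}}))$ holds for any normalized $q$. A minimal numerical check with a two-state discrete latent variable (toy probabilities, sums in place of the integrals):

```python
import numpy as np

# Toy joint p(y, theta) over a two-valued latent theta, for one observed y
p_joint = np.array([0.3, 0.1])
p_y = p_joint.sum()                      # evidence p(y)
p_post = p_joint / p_y                   # posterior p(theta | y)

q = np.array([0.6, 0.4])                 # any normalized variational q(theta)

L = np.sum(q * np.log(p_joint / q))      # lower bound L(q)
KL = -np.sum(q * np.log(p_post / q))     # KL(q || p(theta|y)) >= 0

assert np.isclose(L + KL, np.log(p_y))   # the identity in (4)-(5)
```

Since ${\rm KL} \geqslant 0$, maximizing $L(q)$ is equivalent to minimizing the KL divergence between $q({{\vartheta}})$ and the true posterior.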

 $q({{\vartheta}} ) = q({{x}},{{\alpha}} ,{\alpha _0}){\rm{ = }}q({{x}})q({{\alpha}} )q({\alpha _0})$

 $q({{x}}){\rm{ = }}N({{x}}|\overline {{u}} ,\overline {{\varSigma}} )$
 $q({{\alpha}} ){\rm{ = }}\prod\limits_{m = 1}^N {\varGamma ({\alpha _m}|{{\overline a }_m},{{\overline b }_m})}$
 $q({\alpha _0}){\rm{ = }}\varGamma ({\alpha _0}|\overline c ,\overline d )$

 $\overline {{u}} = {\alpha _0}\overline {{\varSigma}} {{{\varPhi}} ^{\rm{T}}}{{y}}$ (6)
 $\overline {{\varSigma}} {\rm{ = (diag}}({\alpha _m}) + {\alpha _0}{{{\varPhi}} ^{\rm{T}}}{{\varPhi}} {)^{ - 1}}$ (7)
 ${\overline a_m} = a + 1/2$ (8)
 ${\overline b_m} = b + \frac{{|{{\overline u}_m}{|^2} + {{\overline {{\varSigma}} }_{mm}}}}{2}$ (9)
 $\overline c = c + (N + 1)/2$ (10)
 $\overline d = d + \frac{{\left\| {{{y}} - {{\varPhi}} \overline {{u}} } \right\|^2 + {\rm{tr}}(\overline {{\varSigma}} {{{\varPhi}} ^{\rm{T}}}{{\varPhi}} )}}{2}$ (11)
 ${\alpha _m} = {\overline a_m}/{\overline b_m}$
 ${\alpha _0} = \overline c/\overline d$

 $\begin{gathered} L(q({{x}},{{\alpha}} ,{\alpha _0})) = \left\langle {\ln p({{y}}|{{x}},{\alpha _0})} \right\rangle + \left\langle {\ln p({{x}}|{{\alpha}} )} \right\rangle + \hfill \\ \quad\quad\quad\quad\quad\quad\;\,\left\langle {\ln p({{\alpha}} )} \right\rangle + \left\langle {\ln p({\alpha _0})} \right\rangle - \left\langle {\ln q({{x}})} \right\rangle - \hfill \\ \quad\quad\quad\quad\quad\quad\;\,\left\langle {\ln q({{\alpha}} )} \right\rangle - \left\langle {\ln q({\alpha _0})} \right\rangle \hfill \\ \end{gathered}$

 $\begin{gathered} L(q({{x}},{{\alpha}} ,{\alpha _0})){\rm{ = }}\frac{1}{2}\ln |\overline {{\varSigma}} | - {\overline a _m}\sum\limits_{m = 1}^N {\ln {{\overline b }_m}} + \hfill \\ \quad\quad\quad\quad\quad\;\;\;\;{\rm{ }}\frac{1}{2}\sum\limits_{m = 1}^N {\ln {{\overline {{\varSigma}} }_{mm}}} - \overline c \ln \overline d + {L_{{\rm{const}}}} \hfill \\ \end{gathered}$ (12)
 $\begin{gathered} {L_{{\rm{const}}}} = - \frac{N}{2}\ln 2{\rm{{\text π}}} + \frac{{M + N}}{2} - \hfill \\ \quad\quad\;\;\;{\rm{ }}(M + N + 1)\ln \varGamma (a) + (M + N + 1)a\ln b + \hfill \\ \quad\quad\;\;\;{\rm{ }}(M + N)\ln \varGamma ({{\overline a}_m}) + \ln \varGamma (\overline c) \hfill \\ \end{gathered}$

1) Initialize $\overline {{u}}$, $\overline {{\varSigma}}$, ${\overline a_m}$, ${\overline b_m}$, $\overline c$, and $\overline d$;

2) Update $\overline {{u}}$ and $\overline {{\varSigma}}$ using equations (6) and (7);

3) Update ${\overline a_m}$, ${\overline b_m}$, $\overline c$, and $\overline d$ using equations (8)~(11);

4) Update ${L_t}(q({{x}},{{\alpha}} ,{\alpha _0}))$ using equation (12), where $t$ is the current iteration number;

5) If $\displaystyle\frac{{{L_t}(q({{x}},{{\alpha}} ,{\alpha _0})) - {L_{t - 1}}(q({{x}},{{\alpha}} ,{\alpha _0}))}}{{{L_{t - 1}}(q({{x}},{{\alpha}} ,{\alpha _0}))}} < \sigma$, stop the iteration and output the reconstructed signal ${{x}}$; otherwise, return to step 2) for the next iteration.
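Steps 1)~5) can be sketched in Python (NumPy) for the real-valued model. This is a sketch under assumptions: broad hyperpriors $a=b=c=d=10^{-6}$, the measurement count used in the update of $\overline c$, and, for simplicity, convergence monitored on the change in $\overline{{u}}$ rather than on the full bound (12):

```python
import numpy as np

def vsbl(y, Phi, a=1e-6, b=1e-6, c=1e-6, d=1e-6, tol=1e-8, max_iter=500):
    """Sketch of the variational updates (6)-(11) for a real-valued model."""
    M, N = Phi.shape
    alpha = np.ones(N)                   # precisions of x
    alpha0 = 1.0                         # noise precision
    PtP, Pty = Phi.T @ Phi, Phi.T @ y
    u = np.zeros(N)
    for _ in range(max_iter):
        Sigma = np.linalg.inv(np.diag(alpha) + alpha0 * PtP)       # eq (7)
        u_new = alpha0 * Sigma @ Pty                               # eq (6)
        a_bar = a + 0.5                                            # eq (8)
        b_bar = b + 0.5 * (u_new**2 + np.diag(Sigma))              # eq (9)
        c_bar = c + 0.5 * (M + 1)                                  # eq (10)
        r = y - Phi @ u_new
        d_bar = d + 0.5 * (r @ r + np.trace(Sigma @ PtP))          # eq (11)
        alpha, alpha0 = a_bar / b_bar, c_bar / d_bar               # Gamma means
        if np.linalg.norm(u_new - u) <= tol * (1 + np.linalg.norm(u)):
            u = u_new
            break
        u = u_new
    return u, alpha, alpha0
```

On a small synthetic problem, the largest-magnitude entries of $\overline{{u}}$ land on the true support, which is what yields the DOA estimates after gridding.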

3 Simulation results

3.1 DOA estimation results under different SNRs

3.2 DOA estimation results with different numbers of array elements

3.3 Comparison of algorithm running time

4 Conclusions

1) At low SNR, the proposed algorithm achieves higher DOA estimation accuracy and success rate, which favors DOA estimation in complex electromagnetic environments.

2) The proposed algorithm can estimate signal arrival angles with high accuracy and a high success rate using fewer array elements, easing the hardware burden of storing, transmitting, and processing large amounts of data.

3) The proposed algorithm has lower computational complexity and greatly reduced running time, which benefits DOA estimation applications with real-time requirements.

4) The proposed algorithm applies only to single-snapshot DOA estimation; extending the idea to the multi-snapshot case is the next research direction.
