# Study Notes: Gradient Pursuits (IEEE Trans. Signal Processing, 2008)

```matlab
function theta = CS_GP(y,A,t)
% CS_GP  Recover a sparse vector by gradient pursuit
% Version: 1.0 written by wwf @2017-04-28
%   y = Phi * x
%   x = Psi * theta
%   y = Phi * Psi * theta
%   Let A = Phi*Psi, so y = A*theta.
%   Given y and A, recover theta.
[y_rows,y_columns] = size(y);
if y_rows < y_columns
    y = y';                                 % y should be a column vector
end
[M,N] = size(A);                            % sensing matrix A is M-by-N
theta = zeros(N,1);                         % stores the recovered theta (column vector)
aug_y = [];
r_n = y;                                    % initialize the residual to y
Aug_t = [];
product = zeros(1,N);
pos_array = zeros(1,t);
for ii = 1:t                                % iterate t times (t is an input argument)
    for col = 1:N
        product(col) = abs(A(:,col)'*r_n);  % inner product of each column of A with the residual
    end
    [val,pos] = max(product);               % column most correlated with the residual
    Aug_t = [Aug_t, A(:,pos)];
    pos_array(ii) = pos;
    g_n = Aug_t'*r_n;                       % gradient direction
    c_n = Aug_t*g_n;
    w_n = (r_n'*c_n)/(c_n'*c_n);            % exact line-search (steepest-descent) step size
    d_n = w_n*g_n;
    aug_y = [aug_y; 0] + d_n;               % zero-pad for the new atom, then take the gradient step
    r_n = r_n - w_n*c_n;                    % update the residual
end
theta(pos_array) = aug_y;
end
```
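As a runnable cross-check of the `CS_GP` loop above, here is a minimal NumPy sketch of the same gradient-pursuit iteration (Python rather than MATLAB, purely for illustration; the matrix sizes, random seed, and sparse test vector below are invented):

```python
import numpy as np

def gradient_pursuit(y, A, t):
    """Gradient pursuit: pick the atom most correlated with the residual,
    then take an exact line-search step along the gradient restricted to
    the selected atoms (the same loop as CS_GP)."""
    M, N = A.shape
    support = []            # indices of selected atoms
    coeff = np.zeros(0)     # coefficients on the selected atoms
    r = y.astype(float).copy()
    for _ in range(t):
        pos = int(np.argmax(np.abs(A.T @ r)))   # most correlated column
        if pos not in support:
            support.append(pos)
            coeff = np.append(coeff, 0.0)       # zero-pad for the new atom
        A_sub = A[:, support]
        g = A_sub.T @ r                         # gradient direction
        c = A_sub @ g
        w = (r @ c) / (c @ c)                   # exact line-search step size
        coeff = coeff + w * g
        r = r - w * c
    theta = np.zeros(N)
    theta[support] = coeff
    return theta

# tiny synthetic test (sizes, seed and sparsity pattern are invented)
rng = np.random.default_rng(0)
A = rng.standard_normal((64, 128)) / np.sqrt(64)
theta_true = np.zeros(128)
theta_true[[5, 40, 99]] = [1.0, -2.0, 1.5]
y = A @ theta_true
theta_hat = gradient_pursuit(y, A, 30)
```

Because the step size is an exact line search along the restricted gradient, the residual norm never increases, and once the true support has been selected the remaining iterations behave like steepest descent on a small, well-conditioned least-squares problem.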

```matlab
function theta = GP_test(y,A,t)
% GP_test  Gradient pursuit in operator form
% Version: 1.0 written by wwf @2017-04-28
%   y = Phi * x
%   x = Psi * theta
%   y = Phi * Psi * theta
%   Let A = Phi*Psi, so y = A*theta.
%   Given y and A, recover theta.
[y_rows,y_columns] = size(y);
if y_rows < y_columns
    y = y';                             % y should be a column vector
end
[M,N] = size(A);                        % sensing matrix A is M-by-N
theta = zeros(N,1);                     % stores the recovered theta (column vector)
r_n = y;                                % initialize the residual to y
d_n = zeros(N,1);
P  = @(z) A*z;                          % forward operator
Pt = @(z) A'*z;                         % adjoint operator
IN = [];

for ii = 1:t                            % iterate at most t times (t is an input argument)
    product = Pt(r_n);                  % inner products of the columns of A with the residual
    [v,I] = max(abs(product));
    if isempty(find(IN==I,1))
        IN = [IN I];                    % add the newly selected atom to the support
    else
        break;                          % stop once no new atom is selected
    end
    d_n(IN) = product(IN);              % gradient restricted to the support
    c_n = P(d_n);
    a_n = r_n'*c_n/(c_n'*c_n);          % exact line-search step size
    theta = theta + a_n*d_n;
    r_n = r_n - a_n*c_n;
end
end
```
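`GP_test` touches the sensing matrix only through the forward/adjoint handles `P` and `Pt`, which is what makes fast implicit operators possible. A hedged NumPy sketch of that operator-form pattern follows; the explicit matrix below merely stands in for a fast operator pair, and all sizes, seeds, and test data are invented:

```python
import numpy as np

def gp_operator(y, P, Pt, N, t):
    """Operator-form gradient pursuit as in GP_test: the sensing matrix is
    accessed only through forward (P) and adjoint (Pt) products, and the
    loop stops as soon as no new atom is selected."""
    theta = np.zeros(N)
    r = y.astype(float).copy()
    support = []
    for _ in range(t):
        dr = Pt(r)                          # correlations with the residual
        pos = int(np.argmax(np.abs(dr)))
        if pos in support:
            break                           # no new atom: stop early
        support.append(pos)
        d = np.zeros(N)
        d[support] = dr[support]            # gradient restricted to the support
        c = P(d)
        a = (r @ c) / (c @ c)               # exact line-search step size
        theta = theta + a * d
        r = r - a * c
    return theta

# the explicit matrix stands in for a fast operator pair
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 100)) / np.sqrt(50)
P, Pt = (lambda z: A @ z), (lambda z: A.T @ z)
x = np.zeros(100)
x[[3, 77]] = [2.0, -1.0]
y = A @ x
theta_hat = gp_operator(y, P, Pt, 100, 50)
```

Note the early-exit behavior: because the loop breaks as soon as the best atom is already in the support, this variant can stop before the residual is fully driven down, trading accuracy for speed.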

CGP (the code is named `greed_omp_cgp`; the excerpt below shows the main loop):

```matlab
while ~done
    DR(IN) = 0;
    [v,I] = max(abs(DR));
    IN = [IN I];
    k = k+1;
    if k==1
        d(IN) = 1;
        PG(1) = 1;
    else
        %%%% Calculate P'*G, but only need new column and new row %%%%
        gnew = Pt(new_element);             % gnew plays the role of G, i.e. Psi'*Psi
        PG(k-1,k-1) = D(1:k,k-1)'*[g;1];
        g = gnew(IN);
        % PG holds D'*G
        PG(:,k) = D'*[g;zeros(maxM-k,1)];   % 1 general mult.
        %%%% Calculate conjugate directions %%%%
        b = (PG(1:k-1,1:k)*DR(IN))./(dPPd(1:k-1));
        d(IN) = DR(IN) - D(1:k,1:k-1)*b;    % d should be orthogonal to the first k-1 columns of G
    end
    D(1:k,k) = d(IN);                       % D collects the update directions of the previous iterations
    Pd = P(d);
    dPPd(k) = Pd'*Pd;
    a = (DR'*d)/dPPd(k);
    s = s + a*d;
    Residual = Residual - a*Pd;
    DR = Pt(Residual);

    ERR = Residual'*Residual/n;
    if comp_err
        err_mse(iter) = ERR;
    end

    if comp_time
        iter_time(iter) = toc;
    end
```
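The heart of the CGP excerpt is the coefficient vector `b`, which makes each new direction conjugate (orthogonal under A'A) to all previous directions, so the line search is optimal over the whole span of past directions. Here is a simplified NumPy sketch of that idea, using explicit Gram-Schmidt in place of the excerpt's cached `PG`/`dPPd` bookkeeping (sizes, seed, and test data are invented):

```python
import numpy as np

def cgp(y, A, t):
    """Conjugate gradient pursuit in the spirit of greed_omp_cgp: each new
    update direction is conjugated (in the A'A inner product) against all
    previous directions before the exact line-search step."""
    M, N = A.shape
    r = y.astype(float).copy()
    s = np.zeros(N)
    IN = []                 # selected atoms
    dirs, Pdirs = [], []    # past directions d and their images A @ d
    for _ in range(t):
        if np.linalg.norm(r) < 1e-10:
            break                          # residual (numerically) zero
        DR = A.T @ r
        masked = np.abs(DR)
        masked[IN] = -1.0                  # exclude already-selected atoms
        IN.append(int(np.argmax(masked)))
        g = np.zeros(N)
        g[IN] = DR[IN]                     # gradient on the current support
        d, Pg = g.copy(), A @ g
        for dj, Pdj in zip(dirs, Pdirs):
            d -= ((Pdj @ Pg) / (Pdj @ Pdj)) * dj   # enforce (A d)'(A dj) = 0
        Pd = A @ d
        a = (r @ Pd) / (Pd @ Pd)           # exact line-search step size
        s += a * d
        r -= a * Pd
        dirs.append(d)
        Pdirs.append(Pd)
    return s, Pdirs

# synthetic test (sizes, seed and sparsity pattern are invented)
rng = np.random.default_rng(2)
A = rng.standard_normal((30, 60)) / np.sqrt(30)
x = np.zeros(60)
x[[4, 20, 55]] = [3.0, -2.0, 1.5]
y = A @ x
s_hat, Pdirs = cgp(y, A, 5)
```

With full conjugation, each iteration minimizes the residual over the span of all directions so far, which is what makes CGP behave like OMP while avoiding an explicit least-squares solve.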

ACGP (the code is named `greed_nomp`; the excerpt below shows the main loop):

```matlab
tic
t = 0;
p = zeros(m,1);
DR = Pt(Residual);
[v,I] = max(abs(DR));
if weakness~=1
    [vals,inds] = sort(abs(DR),'descend');
    I = inds(find(vals>=alpha*v));
end

IN = union(IN,I);
if strcmp(STOPCRIT,'M') && length(IN)>=STOPTOL
    IN = IN(1:STOPTOL);
end
pDDp = 1;
done = 0;
iter = 1;
while ~done

    % Select new element

    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    %  Iteration to automatic selection of the number of gradient steps
    %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
    %             finished=0;
    %             while ~finished
    % Update direction
    if iter==1
        p(IN) = DR(IN);             % p is d_n in the paper; on the first iteration d_n equals the inner products
        Dp = P(p);                  % Dp is c_n in the paper, i.e. Psi times d_n
    else
        b = -Dp'*PDR/pDDp;          % conjugacy coefficient b
        p(IN) = DR(IN) + b*p(IN);   % the updated direction d_n
        Dp = PDR + b*Dp;            % c_n = P(d_n); expanding d_n gives P(DR) + b*P(p)
    end
    % Step size
    %                  Dp=P(p); % =P(DR(IN)) +b P(p(IN));
    pDDp = Dp'*Dp;
    a = Residual'*Dp/(pDDp);
    % Update coefficients
    s = s + a*p;
    % New Residual and inner products
    Residual = Residual - a*Dp;
    DR = Pt(Residual);
    % select new element
    [v,I] = max(abs(DR));
    if weakness~=1
        [vals,inds] = sort(abs(DR),'descend');
        I = inds(find(vals>=alpha*v));
    end
    IN = union(IN,I);
    if strcmp(STOPCRIT,'M') && length(IN)>=STOPTOL
        IN = IN(1:STOPTOL);
    end
    %                  % Only if we select new element do we leave the loop
    %                      if isempty(find (IN==I, 1))
    %                         IN=[IN I];
    %                         finished=1;
    %                      end
    %             end
```
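ACGP keeps only the single previous direction: the coefficient `b=-Dp'*PDR/pDDp` conjugates the new gradient against it, which costs one inner product instead of CGP's growing Gram-Schmidt. A hedged NumPy sketch of that update (Python for illustration only; sizes, seed, and test data are invented):

```python
import numpy as np

def acgp(y, A, t):
    """Approximate conjugate gradient pursuit in the spirit of greed_nomp:
    the new direction is conjugated against the previous direction only,
    using the cheap coefficient b = -(Dp'PDR)/pDDp."""
    M, N = A.shape
    r = y.astype(float).copy()
    s = np.zeros(N)
    IN = []
    p = np.zeros(N)         # previous update direction
    Dp = np.zeros(M)        # A @ p, kept up to date without extra products
    for it in range(t):
        if np.linalg.norm(r) < 1e-10:
            break                            # residual (numerically) zero
        DR = A.T @ r
        i = int(np.argmax(np.abs(DR)))
        if i not in IN:
            IN.append(i)                     # grow the support
        g = np.zeros(N)
        g[IN] = DR[IN]                       # gradient on the support
        PDR = A @ g
        if it == 0:
            p, Dp = g, PDR
        else:
            b = -(Dp @ PDR) / (Dp @ Dp)      # conjugate against previous direction only
            p = g + b * p
            Dp = PDR + b * Dp
        a = (r @ Dp) / (Dp @ Dp)             # exact line-search step size
        s = s + a * p
        r = r - a * Dp
    return s

# synthetic test (sizes, seed and sparsity pattern are invented)
rng = np.random.default_rng(3)
A = rng.standard_normal((40, 80)) / np.sqrt(40)
x = np.zeros(80)
x[[5, 30]] = [2.0, -1.2]
y = A @ x
s_hat = acgp(y, A, 20)
```

Because the residual after an exact line search is orthogonal to the previous `Dp`, the cheap one-term conjugation still guarantees a residual decrease at least as large as a plain gradient step, which is the trade-off ACGP makes between CGP's cost and GP's convergence speed.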

[1] Blumensath T, Davies M E. Gradient pursuits[J]. IEEE Transactions on Signal Processing, 2008, 56(6): 2370-2382.

[2] Liu Panpan. Research on Gradient Pursuit Algorithms in Compressed Sensing[D]. Nanjing: Nanjing University of Posts and Telecommunications, 2015: 7-21.
