MATLAB simulation of a population density detection algorithm based on image gray-level co-occurrence matrix texture extraction and a GRNN neural network

1. Algorithm simulation results

The simulation results under MATLAB 2013b are as follows:

 

 

2. Overview of the underlying theory

       The gray-level co-occurrence matrix (GLCM) is a widely used method that describes texture by studying the spatial correlation properties of gray levels. [1] In 1973, Haralick et al. proposed using the gray-level co-occurrence matrix to describe texture features. Since texture arises from gray-level distributions that recur across spatial positions, a certain gray-level relationship exists between two pixels separated by a given distance in the image, namely the spatial correlation of gray levels. The GLCM is usually computed along four directions: 0°, 45°, 90° and 135°. If these four directions are not combined, several classes of features are obtained for each direction, producing far too many texture features to be practical. The feature values of the four directions can therefore be averaged; after comparison, this article takes the average over the four directions as the final co-occurrence-matrix feature values.
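A minimal sketch of this four-direction averaging (assuming the Image Processing Toolbox; the test image, number of gray levels and feature set are illustrative choices, not taken from the original program):

I       = imread('cameraman.tif');                 % placeholder grayscale image
offsets = [0 1; -1 1; -1 0; -1 -1];                % offsets for 0°, 45°, 90°, 135°
glcms   = graycomatrix(I,'Offset',offsets,'NumLevels',16,'Symmetric',true);
stats   = graycoprops(glcms,{'Contrast','Correlation','Energy','Homogeneity'});
% graycoprops returns one value per direction; average over the four directions
feat    = [mean(stats.Contrast), mean(stats.Correlation), ...
           mean(stats.Energy),   mean(stats.Homogeneity)];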

 

       The main idea behind generating a texture-feature image is as follows: for the sub-image covered by each small window, compute the window's gray-level co-occurrence matrix and texture feature values with the texture-feature routine, and assign the resulting feature value to the center pixel of that window; this completes the texture feature computation for the first window. The window is then shifted by one pixel to form another small window image, and the co-occurrence matrix and texture feature values are recomputed. Proceeding in this way over the whole image produces a matrix of texture feature values, which is then converted into a texture-feature image.
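A minimal sketch of this sliding-window procedure (the window size, number of gray levels and the choice of contrast as the feature are assumptions for illustration, not values from the original program):

I  = imread('cameraman.tif');                      % placeholder grayscale image
w  = 15;  r = (w-1)/2;                             % assumed odd window size
Ip = padarray(I,[r r],'symmetric');                % pad so every pixel gets a full window
T  = zeros(size(I));
for ii = 1:size(I,1)
    for jj = 1:size(I,2)
        win      = Ip(ii:ii+w-1, jj:jj+w-1);       % sub-image under the window
        glcm     = graycomatrix(win,'Offset',[0 1; -1 1; -1 0; -1 -1], ...
                                'NumLevels',8,'Symmetric',true);
        s        = graycoprops(glcm,'Contrast');
        T(ii,jj) = mean(s.Contrast);               % feature value assigned to the window center
    end
end
texture_image = mat2gray(T);                       % texture-feature matrix scaled as an image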

 

      The generalized regression neural network (GRNN) is a neural network built on nonparametric kernel regression; it estimates the probability density function relating the independent and dependent variables from the observed samples. The GRNN structure is shown in Figure 1; the network consists of four layers of neurons: the input layer, the pattern layer, the summation layer and the output layer.

 

 

 

       The performance of a GRNN is set mainly through the smoothing factor of the kernel function of its hidden regression units; different smoothing factors yield different network performance.
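As a small illustration of this sensitivity (a sketch on synthetic 1-D data, assuming the Neural Network Toolbox; none of the values come from the original program), one can sweep the spread parameter and compare hold-out errors:

x   = linspace(0,2*pi,200);
y   = sin(x) + 0.1*randn(size(x));                 % made-up noisy regression problem
xtr = x(1:2:end);  ytr = y(1:2:end);               % training half
xte = x(2:2:end);  yte = y(2:2:end);               % hold-out half
spreads = [0.05 0.1 0.2 0.5 1.0];                  % candidate smoothing factors
mse     = zeros(size(spreads));
for k = 1:numel(spreads)
    net    = newgrnn(xtr,ytr,spreads(k));          % smoothing factor = spreads(k)
    mse(k) = mean((sim(net,xte) - yte).^2);
end
[~,best] = min(mse);
fprintf('best spread = %.2f, hold-out MSE = %.4f\n',spreads(best),mse(best));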

 

       The number of neurons in the input layer equals the dimension m of the input vectors in the learning samples. Each pattern-layer neuron corresponds to a different learning sample, and the transfer function of the i-th neuron in the pattern layer is:
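For reference, in the standard GRNN formulation this transfer function is usually written as

        p_i = exp( -(X - X_i)' * (X - X_i) / (2*σ^2) ),   i = 1, 2, ..., n

where X is the input vector, X_i is the learning sample attached to the i-th pattern-layer neuron, σ is the smoothing factor, and n is the number of learning samples.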

 

 

 

       It follows that once the learning samples are selected, the structure and the weights of the GRNN are completely determined, so training a GRNN is far more convenient than training a BP network or an RBF network.
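In MATLAB this convenience amounts to a single constructor call, since "training" a GRNN means storing the samples and choosing a spread. The sketch below uses placeholder feature/target matrices P and T (hypothetical names, assuming the Neural Network Toolbox):

P      = rand(4,50);                 % e.g. 4 averaged GLCM features per training sample (placeholder)
T      = rand(1,50);                 % e.g. density label per training sample (placeholder)
spread = 0.3;                        % assumed smoothing factor
net    = newgrnn(P,T,spread);        % structure and weights are fixed by the samples themselves
y      = sim(net,rand(4,5));         % predictions for 5 new feature vectors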

 

 

 

 

3. MATLAB core program

 

indxx               = 0; 
for tt = frameNum_Originals
    fprintf('current frame: %d\n',tt);                    % progress display
    indxx            = indxx + 1; 
    pixel_original   = read(Obj,tt);                       % read frame tt from the video object
    pixel_original2  = imresize(pixel_original,[RR,CC]);   % resize to the working resolution RR-by-CC
 
    
 
    % Flatten the resized frame into a (RR*CC)-by-CRGB matrix: one row per pixel
    Temp  = pixel_original2;
    Temp  = reshape(Temp,size(Temp,1)*size(Temp,2),size(Temp,3));  
 
    image = Temp;
    % Squared distance of every pixel to each of the K Gaussian modes,
    % normalised by the per-mode variance
    for kk = 1:K   
        Datac         = double(Temp)-reshape(Mus(:,kk,:),D,CRGB);
        Squared(:,kk) = sum((Datac.^ 2)./reshape(Sigmas(:,kk,:),D,CRGB),2); 
    end
    % A pixel is matched to its closest mode only if that mode lies within
    % the deviation threshold
    [junk,index] = min(Squared,[],2); 
    Gaussian                                                = zeros(size(Squared));
    Gaussian(sub2ind(size(Squared),1:length(index),index')) = ones(D,1);
    Gaussian                                                = Gaussian&(Squared<Deviation_sq);
    % Parameter update: adjust weights, means and variances of the matched modes
    Weights = (1-Alpha).*Weights+Alpha.*Gaussian;
    for kk = 1:K
        pixel_matched   = repmat(Gaussian(:,kk),1,CRGB);
        pixel_unmatched = abs(pixel_matched-1);
        Mu_kk           = reshape(Mus(:,kk,:),D,CRGB);
        Sigma_kk        = reshape(Sigmas(:,kk,:),D,CRGB);
        Mus(:,kk,:)     = pixel_unmatched.*Mu_kk+pixel_matched.*(((1-Rho).*Mu_kk)+(Rho.*double(image)));
        Mu_kk           = reshape(Mus(:,kk,:),D,CRGB); 
        Sigmas(:,kk,:)  = pixel_unmatched.*Sigma_kk+pixel_matched.*(((1-Rho).*Sigma_kk)+repmat((Rho.* sum((double(image)-Mu_kk).^2,2)),1,CRGB));       
    end
    % Pixels that matched no mode: replace their weakest mode (smallest
    % weight/sigma ratio) with a new Gaussian centred on the current pixel value
    replaced_gaussian   = zeros(D,K); 
    mismatched          = find(sum(Gaussian,2)==0);       
    for ii = 1:length(mismatched)
        [junk,index]                            = min(Weights(mismatched(ii),:)./sqrt(Sigmas(mismatched(ii),:,1)));
        replaced_gaussian(mismatched(ii),index) = 1;
        Mus(mismatched(ii),index,:)             = image(mismatched(ii),:);
        Sigmas(mismatched(ii),index,:)          = ones(1,CRGB)*Variance;
        Weights(mismatched(ii),index)           = Props;  
    end
    Weights         = Weights./repmat(sum(Weights,2),1,K);
    active_gaussian = Gaussian+replaced_gaussian;
    % Background segmentation: rank modes by weight/sigma and keep the most
    % reliable ones as background; the remaining pixels form the foreground map
    [junk,index]    = sort(Weights./sqrt(Sigmas(:,:,1)),2,'descend');
    bg_gauss_good   = index(:,1);
    linear_index    = (index-1)*D+repmat([1:D]',1,K);
    weights_ordered = Weights(linear_index);
    for kk = 1:K
        Weight(:,kk)= sum(weights_ordered(:,1:kk),2);
    end
    bg_gauss(:,2:K) = Weight(:,1:(K-1)) < Back_Thresh;
    bg_gauss(:,1)   = 1;           
    bg_gauss(linear_index)     = bg_gauss;
    active_background_gaussian = active_gaussian & bg_gauss;
    foreground_pixels          = abs(sum(active_background_gaussian,2)-1);
    foreground_map             = reshape(sum(foreground_pixels,2),RR,CC);
    Images1                    = foreground_map;   
    objects_map                = zeros(size(foreground_map),'int32');
    object_sizes               = [];
    Obj_pos                    = [];
    new_label                  = 1;

 

  

 

posted @ 2023-05-12 19:53  我爱C编程