'ENTROPY' calculates the entropy of an isolated matrix with respect to its own statistics: it counts the distinct values (xi) and their numbers of occurrences (ni), computes the probability of each xi as Pi = ni / N, and finally evaluates the summation: Entropy = -sum( Pi * log2(Pi) ) - USE: [e] = entropia(a);
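As a quick worked example (the matrix here is illustrative, not taken from the original file): for a = [1 1; 1 2] the distinct values are 1 (n1 = 3) and 2 (n2 = 1), so P1 = 3/4 and P2 = 1/4, giving Entropy = -( 3/4*log2(3/4) + 1/4*log2(1/4) ) ≈ 0.8113 bits.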
% 'ENTROPY' calculates the entropy of an isolated matrix with respect to
% its own statistics: it counts the distinct values (xi) and their
% numbers of occurrences (ni), computes the probability of each xi as
% Pi = ni / N, and finally evaluates the summation:
%
%   Entropy = -sum( Pi * log2(Pi) )
%
% USE: [e] = entropia(a);

function [e,ta]=entropia(a)

n=size(a);
k=1;

% Build the occurrence table ta = [index value count], one row per
% distinct value found in the matrix.
for i=1:n(1)
    for j=1:n(2)
        dato=a(i,j);
        if (i==1) && (j==1)
            % First element starts the table.
            ta=[1 dato 1];
        else
            valores=ta(:,2);
            condicion=sum((valores-dato)==0);
            if condicion>0
                % Value already seen: increment its count.
                indice=find((valores-dato)==0);
                ta(indice,3)=ta(indice,3)+1;
            else
                % New value: append a new row to the table.
                k=k+1;
                ta(k,:)=[k dato 1];
            end
        end
    end
end

% Probability of each distinct value and its information content in bits.
P=ta(:,3)/(n(1)*n(2));
LP=-log2(P);

% Entropy in bits: e = -sum( P .* log2(P) ).
e=sum(P.*LP);
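Minimal usage sketch (the example matrix and the cross-check with unique/accumarray are assumptions for illustration, not part of the original file):

a = [1 1 2; 2 3 3];      % 6 samples; the values 1, 2 and 3 occur twice each
[e, ta] = entropia(a);   % e is approximately log2(3) = 1.585 bits
                         % ta = [index value count] occurrence table

% Cross-check with MATLAB built-ins (equivalent, vectorised):
[vals, ~, idx] = unique(a(:));
P  = accumarray(idx, 1) / numel(a);
e2 = -sum(P .* log2(P)); % should match e up to rounding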