Translating a neural network program into plain language
This is my first question here.
A professor at my university handed out a MATLAB program.
I have to present what this program does, and what processing it performs, to students with no background in the subject.
However, I have almost no background myself, and I cannot work out exactly what the program is doing. I can tell it uses a neural network for something, but that is all.
Could someone explain what this program means, in terms that even a complete beginner like me can follow?
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%% Neural Network for Classification %%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%% clear, clf, and close %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
clear;
clf;
close all hidden;
%%%%%%%%%%%%%% True Classification Rule %%%%%%%%%%%%%%%%%%%%%%%%%%%%
true_rule=@(x1,x2)(x1.^2+x2.^2-2); %%% positive outside the circle x1^2+x2^2=2
%%%%%%%%%%%%%%%%%%%%% Generate Training Data %%%%%%%%%%%%%%%%%%%%%
n=50; %%%%% Number of Training samples
xdata=-4*rand(2,n)+2; %%% inputs drawn uniformly from the square (-2,2)^2
ydata(1,:)=0.01+0.98*(sign(true_rule(xdata(1,:),xdata(2,:)))+1)/2; %%% labels: ~0.99 outside the circle, ~0.01 inside
%%%%%%%%%%%%%%%%%%%%% Draw Training Samples %%%%%%%%%%%%%%%%%%%%
for i=1:1:n
    if(ydata(i)>0.5)
        plot(xdata(1,i),xdata(2,i),'ro'); hold on; %%% class 1: red circles
    else
        plot(xdata(1,i),xdata(2,i),'b*'); hold on; %%% class 0: blue stars
    end
end
xlim([-2,2]);
ylim([-2,2]);
drawnow;
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%%%%%%%%%%% Begin Neural Network Learning %%%%%%%%%%%%
neuron=@(u,ph,h)(1./(1+exp(-(u*h+ph)))); %%% sigmoid unit: logistic of (weights*input + bias)
%%%%%%%%%%%%%%%%%%%%%%%% Hyperparameters %%%%%%%%%%%%%
HYPERPARAMETER=0.0001; %%% weight-decay coefficient
diffhyper=@(a)(HYPERPARAMETER*a); %%% weight-decay term subtracted at each update
%%%%%%%%%%%%%%%%%%%%%%% Training Conditions %%%%%%%%%%%%%%%%%
CYCLE=2000; %%% training cycles
N=1; %%% output Units
H=8; %%% hidden Units
M=2; %%% input Units
ETA=0.8; %%% learning rate (gradient step size)
ALPHA=0.3; %%% momentum coefficient
EPSILON=0.01; %%% flat-spot constant added to the sigmoid derivative
%%%%%%%%%%%%%%%%%%%%%% Training Initialization %%%%%%%%
u=0.1*randn(N,H); %%% weight from hidden to output
w=0.1*randn(H,M); %%% weight from input to hidden
ph=0.1*randn(N,1); %%% bias of output
th=0.1*randn(H,1); %%% bias of hidden
du=zeros(N,H); %%% gradient weight from hidden to output
dw=zeros(H,M); %%% gradient weight from input to hidden
dph=zeros(N,1); %%% gradient bias of output
dth=zeros(H,1); %%% gradient bias of hidden
%%%%%%%%%%%%%%%%%%%% Backpropagation Learning %%%%%%%%%%%%
for cycle=1:1:CYCLE
    for i=1:1:n
        x=xdata(:,i);
        t=ydata(:,i);
        h=neuron(w,th,x); %%% forward pass: hidden-layer activations
        o=neuron(u,ph,h); %%% forward pass: network output
        %%%%%%%%%%%%%%%%%% delta calculation %%%%%%%%%%%%
        delta1=(o-t).*(o.*(1-o)+EPSILON);
        delta2=(delta1'*u)'.*(h.*(1-h)+EPSILON);
        %%%%%%%%%%%%%%%%%% gradient %%%%%%%%%%%
        du=delta1*h'+ALPHA*du;
        dph=delta1+ALPHA*dph;
        dw=delta2*x'+ALPHA*dw;
        dth=delta2+ALPHA*dth;
        %%%%%%%%%%%%%%%%%%% steepest descent %%%%%%%%%%
        u=u-ETA*du-diffhyper(u);
        ph=ph-ETA*dph;
        w=w-ETA*dw-diffhyper(w);
        th=th-ETA*dth;
    end
end
%%%%%%%%%% End Neural Network Learning %%%%%%%%%%%%%%%%
%%%%%%%%%% Draw Trained Results %%%%%%%%%%%%%%%%
for j=1:1:41
    for k=1:1:41
        xxx(1,1)=-2+(j-1)/10; %%% grid point on a 41x41 mesh over [-2,2]^2
        xxx(2,1)=-2+(k-1)/10;
        testx1(j,k)=-2+(j-1)/10;
        testx2(j,k)=-2+(k-1)/10;
        h=neuron(w,th,xxx);
        testy(j,k)=neuron(u,ph,h); %%% trained network output at the grid point
    end
end
contour(testx1,testx2,testy,5); %%% contour lines of the output show the learned decision boundary
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
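For readers who find Python easier to follow, here is a rough NumPy sketch of what the whole script does (a translation written for this post, not the distributed file; the random seed, the reduced cycle count, and the `predict` helper are my own additions). The script trains a 2-input, 8-hidden, 1-output sigmoid network by per-sample backpropagation with momentum and weight decay, so that the output approximates 1 outside the circle x1^2+x2^2=2 and 0 inside it:

```python
import numpy as np

rng = np.random.default_rng(0)

# True rule: positive outside the circle x1^2 + x2^2 = 2 (mirrors true_rule)
true_rule = lambda x1, x2: x1**2 + x2**2 - 2
# Sigmoid unit, as in the MATLAB "neuron" handle
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# 50 training inputs uniform in (-2, 2)^2; labels ~0.99 outside, ~0.01 inside
n = 50
xdata = -4 * rng.random((2, n)) + 2
ydata = 0.01 + 0.98 * (np.sign(true_rule(xdata[0], xdata[1])) + 1) / 2

N, H, M = 1, 8, 2        # output / hidden / input units
ETA, ALPHA = 0.8, 0.3    # learning rate, momentum coefficient
EPSILON = 0.01           # flat-spot constant added to the sigmoid derivative
DECAY = 0.0001           # weight-decay coefficient (HYPERPARAMETER)
CYCLE = 500              # training epochs (the MATLAB script uses 2000)

u = 0.1 * rng.standard_normal((N, H))   # hidden -> output weights
w = 0.1 * rng.standard_normal((H, M))   # input -> hidden weights
ph = 0.1 * rng.standard_normal((N, 1))  # output bias
th = 0.1 * rng.standard_normal((H, 1))  # hidden bias
du, dw = np.zeros((N, H)), np.zeros((H, M))
dph, dth = np.zeros((N, 1)), np.zeros((H, 1))

for cycle in range(CYCLE):
    for i in range(n):
        x = xdata[:, i:i + 1]
        t = ydata[i]
        h = sigmoid(w @ x + th)   # forward pass: hidden activations
        o = sigmoid(u @ h + ph)   # forward pass: network output
        # Backpropagated errors (deltas) for output and hidden layers
        delta1 = (o - t) * (o * (1 - o) + EPSILON)
        delta2 = (u.T @ delta1) * (h * (1 - h) + EPSILON)
        # Gradients, smoothed with momentum
        du, dph = delta1 @ h.T + ALPHA * du, delta1 + ALPHA * dph
        dw, dth = delta2 @ x.T + ALPHA * dw, delta2 + ALPHA * dth
        # Steepest-descent step; weight decay is applied to u and w only
        u, ph = u - ETA * du - DECAY * u, ph - ETA * dph
        w, th = w - ETA * dw - DECAY * w, th - ETA * dth

# A trained network should output near 1 outside the circle, near 0 inside
predict = lambda x: sigmoid(u @ sigmoid(w @ x + th) + ph)
print(predict(np.array([[1.8], [1.8]])).item())  # outside: should be close to 1
print(predict(np.array([[0.0], [0.0]])).item())  # inside: should be close to 0
```

The final block of the MATLAB script (omitted here) simply evaluates this `predict` step on a 41x41 grid over [-2,2]^2 and plots contour lines of the output, which visualizes the circular decision boundary the network has learned.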