[MATLAB Deep Learning] Neural Networks and Classification Problems
1. Multi-Class Classification
Setting the number of output nodes equal to the number of classes is the approach most likely to give good results, and the class labels at the output can be represented with one-hot encoding. As a rule, binary classification uses the Sigmoid function and multi-class classification uses the Softmax function. Softmax considers not only the weighted sum at its own node but also the outputs of the other output nodes, so interpreting the result of a multi-class network correctly means looking at the relative magnitudes of all node outputs. The Softmax function guarantees that the outputs sum to 1, and it can also be used for binary classification.
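As a quick numeric illustration (the three-element vector below is made up for this explanation, not taken from the network that follows), the Softmax outputs are all positive and sum to 1, so they can be read as class probabilities and compared directly with a one-hot target:

v = [2.0; 1.0; 0.1];       % example weighted sums at three output nodes
y = exp(v) / sum(exp(v))   % Softmax: approximately [0.659; 0.242; 0.099]
sum(y)                     % equals 1 (up to rounding)
d = [1; 0; 0];             % one-hot label for class 1; compare with the largest entry of y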
Multi-class classification example: the input data are five 5x5 matrices representing the digits 1, 2, 3, 4, and 5. The network has 25 input nodes, 50 hidden nodes, and 5 output nodes. The code is as follows:
function [W1, W2] = MultiClass(W1, W2, X, D)
  alpha = 0.9;                        % learning rate
  N = 5;                              % number of training images
  for k = 1:N
    x  = reshape(X(:, :, k), 25, 1);  % k-th 5x5 image as a 25x1 column vector
    d  = D(k, :)';                    % one-hot target for the k-th image

    v1 = W1*x;                        % hidden layer
    y1 = Sigmoid(v1);
    v  = W2*y1;                       % output layer
    y  = Softmax(v);

    e      = d - y;                   % output error
    delta  = e;                       % Softmax output + cross-entropy loss: delta = d - y
    e1     = W2'*delta;               % error propagated back to the hidden layer
    delta1 = y1.*(1-y1).*e1;          % Sigmoid derivative: y1.*(1-y1)

    dW1 = alpha*delta1*x';            % SGD weight updates
    W1  = W1 + dW1;
    dW2 = alpha*delta*y1';
    W2  = W2 + dW2;
  end
end
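The line delta = e corresponds to training the Softmax output layer under cross-entropy loss: the gradient of that loss with respect to the output pre-activation v is y - d, so e = d - y is exactly the descent direction. A small numerical check of that identity (the target and pre-activations below are arbitrary illustrative values, not part of the original code):

d = [0; 1; 0];                              % one-hot target
v = [0.3; -0.2; 1.1];                       % arbitrary pre-activations
loss = @(v) -d' * log(exp(v)/sum(exp(v)));  % cross-entropy with a Softmax output
g = zeros(3, 1);                            % numerical gradient dLoss/dv
for i = 1:3
  h = zeros(3, 1);  h(i) = 1e-6;
  g(i) = (loss(v + h) - loss(v - h)) / 2e-6;
end
% g matches exp(v)/sum(exp(v)) - d, i.e. y - d, so delta = d - y = -dLoss/dv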
The Softmax function is defined as follows:
function y = Softmax(x)
  ex = exp(x);
  y = ex / sum(ex);   % outputs are positive and sum to 1
end
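For the small inputs in this example overflow is not a concern, but with larger weighted sums exp(x) can overflow. A common variant (an optional sketch, not part of the original code) subtracts the maximum before exponentiating; the constant factor cancels in the ratio, so the result is mathematically unchanged:

function y = SoftmaxStable(x)           % illustrative variant, not used by the scripts below
  ex = exp(x - max(x));                 % shift by max(x) for numerical stability
  y = ex / sum(ex);                     % same output as Softmax, without overflow
end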
The Sigmoid function is defined as follows:
function y = Sigmoid(x)
  y = 1 ./ (1 + exp(-x));
end
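The hidden-layer line delta1 = y1.*(1-y1).*e1 in MultiClass relies on the identity Sigmoid'(v) = Sigmoid(v).*(1 - Sigmoid(v)). A quick numerical check at an arbitrary point (the value 0.7 is only illustrative):

v = 0.7;
analytic = Sigmoid(v) * (1 - Sigmoid(v));                   % derivative via the identity
numeric  = (Sigmoid(v + 1e-6) - Sigmoid(v - 1e-6)) / 2e-6;  % central-difference estimate
% analytic and numeric agree to roughly 1e-10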
The test code is as follows:
clear all
rng(3);                  % fixed seed for reproducibility (the rng shim is given at the end)
X = zeros(5, 5, 5);      % five 5x5 binary images of the digits 1-5
X(:, :, 1) = [ 0 1 1 0 0;
0 0 1 0 0;
0 0 1 0 0;
0 0 1 0 0;
0 1 1 1 0
];
X(:, :, 2) = [ 1 1 1 1 0;
0 0 0 0 1;
0 1 1 1 0;
1 0 0 0 0;
1 1 1 1 1
];
X(:, :, 3) = [ 1 1 1 1 0;
0 0 0 0 1;
0 1 1 1 0;
0 0 0 0 1;
1 1 1 1 0
];
X(:, :, 4) = [ 0 0 0 1 0;
0 0 1 1 0;
0 1 0 1 0;
1 1 1 1 1;
0 0 0 1 0
];
X(:, :, 5) = [ 1 1 1 1 1;
1 0 0 0 0;
1 1 1 1 0;
0 0 0 0 1;
1 1 1 1 0
];
D = [ 1 0 0 0 0;
0 1 0 0 0;
0 0 1 0 0;
0 0 0 1 0;
0 0 0 0 1
];
W1 = 2*rand(50, 25) - 1;     % 25 -> 50 hidden-layer weights, uniform in [-1, 1]
W2 = 2*rand( 5, 50) - 1;     % 50 -> 5 output-layer weights

for epoch = 1:10000          % train
  [W1, W2] = MultiClass(W1, W2, X, D);
end

N = 5;                       % inference
for k = 1:N
  x = reshape(X(:, :, k), 25, 1);
  v1 = W1*x;
  y1 = Sigmoid(v1);
  v = W2*y1;
  y = Softmax(v)             % no semicolon: display the output for each image
end
The final outputs assign the highest probability to the correct class for all five training images.
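To read off a single predicted label rather than inspecting each probability vector, the index of the largest Softmax output can be taken. A minimal sketch reusing W1, W2, and X from the script above (the variable name label and the fprintf formatting are only illustrative):

for k = 1:5
  x = reshape(X(:, :, k), 25, 1);
  y = Softmax(W2*Sigmoid(W1*x));     % forward pass, as in the inference loop
  [~, label] = max(y);               % index of the largest output = predicted class
  fprintf('image %d -> class %d\n', k, label);
end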
2. Multi-Class Classification with Slightly Contaminated Data
Real-world data will not necessarily match the training data, so we give the network built above a quick check with slightly contaminated inputs. The code is as follows:
clear all
TestMultiClass;          % run the training script above; it leaves W1 and W2 in the workspace
X = zeros(5, 5, 5);      % five contaminated 5x5 test images
X(:, :, 1) = [ 0 0 1 1 0;
0 0 1 1 0;
0 1 0 1 0;
0 0 0 1 0;
0 1 1 1 0
];
X(:, :, 2) = [ 1 1 1 1 0;
0 0 0 0 1;
0 1 1 1 0;
1 0 0 0 1;
1 1 1 1 1
];
X(:, :, 3) = [ 1 1 1 1 0;
0 0 0 0 1;
0 1 1 1 0;
1 0 0 0 1;
1 1 1 1 0
];
X(:, :, 4) = [ 0 1 1 1 0;
0 1 0 0 0;
0 1 1 1 0;
0 0 0 1 0;
0 1 1 1 0
];
X(:, :, 5) = [ 0 1 1 1 1;
0 1 0 0 0;
0 1 1 1 0;
0 0 0 1 0;
1 1 1 1 0
];
N = 5;                   % inference
for k = 1:N
  x = reshape(X(:, :, k), 25, 1);
  v1 = W1*x;
  y1 = Sigmoid(v1);
  v = W2*y1;
  y = Softmax(v)
end
The outputs are [0.0208, 0.0006, 0.0363, 0.9164, 0.0259], [0.0000, 0.9961, 0.0038, 0.0000, 0.0000], [0.0001, 0.0198, 0.9798, 0.0001, 0.0002], [0.0930, 0.3057, 0.5397, 0.0408, 0.0208], and [0.0363, 0.3214, 0.0717, 0.0199, 0.5506]. Reading off the largest entry in each vector, the five contaminated images are assigned to classes 4, 2, 3, 3, and 5, with the fourth and fifth decided far less confidently (about 0.54 and 0.55) than the clean training images.
The rng function used in the code above is:
function rng(x)
  % substitute for the built-in rng: seeds MATLAB's legacy random number generators
  randn('seed', x)
  rand('seed', x)
end