
Setting up and learning Caffe -- win7-vs2013-gtx650tiboost-cuda8.0 -- CIFAR-10 training and testing, part 2: the quick solver cifar10_quick_solver.prototxt

 

First, get the graphics card set up. Already done? Wonderful! Not yet? Then train on the CPU instead; the steps are the same.

Copy the cuDNN v5.0 headers, libraries, and binaries into the CUDA 8 installation directory.

-----------------------------Preparation--------------------------------------

git clone  https://github.com/BVLC/caffe.git

git branch -a

git checkout windows

cmake-gui

configure (generator: Visual Studio 12 2013 Win64)

change the BLAS option from atlas to open (OpenBLAS)

set BUILD_matlab to ON

check that numpy is installed and configured correctly
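A quick way to do that check is from Python itself. This is a minimal sketch (not part of the Caffe build scripts) that tests whether a module can be found on the current Python path, which is roughly what the configure step needs for the pycaffe bindings:

```python
import importlib.util

def module_available(name):
    # True when the module can be located on the current Python path.
    return importlib.util.find_spec(name) is not None

# The configure step needs numpy for the pycaffe bindings:
print(module_available("numpy"))
```

If this prints False, install numpy into the same Python that CMake picked up before re-running configure.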

configure  

generate

----------------------------build---------------------------------------

choose Release mode

build ALL_BUILD

reload and build all

build the INSTALL target

add the install bin directory to PATH

-----------------------------wget----------------------------------------------

 wget --no-check-certificate http://www.cs.toronto.edu/~kriz/cifar-10-binary.tar.gz

./get_cifar10.sh   (run only the extraction statements from it, skip the rest), i.e.:

    tar -xvf cifar-10-binary.tar.gz && rm -f cifar-10-binary.tar.gz
    mv cifar-10-batches-bin/* . && rm -rf cifar-10-batches-bin

   (the extracted files were placed under the examples directory)
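For reference, each record in the extracted .bin files is 1 label byte followed by 32×32×3 = 3072 pixel bytes (R, G, B planes), which is the layout convert_cifar_data reads when building the LMDB. A small sketch with a synthetic record (the real files hold 10000 such records each):

```python
# CIFAR-10 binary layout: 1 label byte + 32*32*3 pixel bytes per record.
RECORD_BYTES = 1 + 32 * 32 * 3

def parse_record(buf):
    """Split one 3073-byte record into (label, pixel_bytes)."""
    assert len(buf) == RECORD_BYTES
    return buf[0], buf[1:]

# Synthetic example: label 3 ("cat" in the 0-indexed class list) plus dummy pixels.
record = bytes([3]) + bytes(3072)
label, pixels = parse_record(record)
print(label, len(pixels))  # 3 3072
```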

 

 ====================================Summary===========================================

convert_cifar_data.exe data/cifar10 examples/cifar10 lmdb

compute_image_mean.exe -backend=lmdb examples/cifar10/cifar10_train_lmdb examples/cifar10/mean.binaryproto

 caffe train --solver=examples/cifar10/cifar10_quick_solver.prototxt

caffe train --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate

caffe test -model examples/cifar10/cifar10_quick_train_test.prototxt -weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 -iterations 100

 

classification.exe examples/cifar10/cifar10_quick.prototxt examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/cat.jpg

classification.exe examples/cifar10/cifar10_quick.prototxt examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 examples/cifar10/mean.binaryproto data/cifar10/synset_words.txt examples/images/fish-bike.jpg

 ---------------------------------------

The contents of synset_words.txt are as follows:

  1. airplane  
  2. automobile  
  3. bird  
  4. cat  
  5. deer  
  6. dog  
  7. frog  
  8. horse  
  9. ship  
  10. truck 

-------------------------------------------------------------------------------------------------------

ex1.  Classify a cat:

classification.exe  examples/cifar10/cifar10_quick.prototxt  examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5  examples/cifar10/mean.binaryproto  data/cifar10/synset_words.txt  examples/images/cat.jpg

 

---------- Prediction for examples/images/cat.jpg ----------
0.3606 - "cat "
0.3349 - "deer "
0.1377 - "dog "
0.0930 - "truck "
0.0485 - "horse "
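The top-5 lines above come from a softmax over the 10 ip2 scores, sorted descending. A minimal sketch of that step; the scores here are invented (chosen so the ordering echoes the cat example), not taken from the network:

```python
import math

LABELS = ["airplane", "automobile", "bird", "cat", "deer",
          "dog", "frog", "horse", "ship", "truck"]

def softmax(scores):
    # Subtract the max for numerical stability, then normalize.
    mx = max(scores)
    exps = [math.exp(s - mx) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def top_k(scores, k=5):
    # Pair each class with its probability, highest first.
    probs = softmax(scores)
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    return [(round(probs[i], 4), LABELS[i]) for i in order[:k]]

# Invented scores favouring "cat":
for p, name in top_k([0, 0, 0, 2.0, 1.5, 1.0, 0, 0.5, 0, 0.8]):
    print(f'{p:.4f} - "{name}"')
```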

ex2.  Classify the fish-bike image:

classification.exe  examples/cifar10/cifar10_quick.prototxt  examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5  examples/cifar10/mean.binaryproto  data/cifar10/synset_words.txt  examples/images/fish-bike.jpg

 

 

---------- Prediction for examples/images/fish-bike.jpg ----------
0.9334 - "horse "
0.0268 - "airplane "
0.0148 - "deer "
0.0103 - "bird "
0.0090 - "ship "

 ----------------------------------------------------------------------------------------------------------------------------------------------

Explanation of the commands and their parameters:

 

caffe train --solver=examples/cifar10/cifar10_quick_solver.prototxt
When this command finishes it produces two files under examples/cifar10/: cifar10_quick_iter_4000.caffemodel and cifar10_quick_iter_4000.solverstate. That location is fixed; there is no need to specify it by hand.

cifar10_quick_iter_4000.solverstate is used to resume the further training below, while the weight file cifar10_quick_iter_4000.caffemodel could be used to test against the dataset

(it is not needed here, because the further training below produces a newer weight file, cifar10_quick_iter_5000.caffemodel.h5).


caffe train --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate 
When this command finishes it produces cifar10_quick_iter_5000.caffemodel.h5 and cifar10_quick_iter_5000.solverstate.h5; in this example, the weight file cifar10_quick_iter_5000.caffemodel.h5 is the one used for prediction.
[Note] You may need to adjust the paths referenced inside the config files (since we are running under Windows), otherwise the tools will keep reporting errors; the error messages tell you which path to fix. The paths that usually need changing are net and snapshot_prefix in cifar10_quick_solver.prototxt and cifar10_quick_solver_lr1.prototxt, and mean_file and source in cifar10_quick_train_test.prototxt. In fact, I made no changes at all and everything ran as-is.
A brief explanation of the settings in the solver file:
net: the network description file used for training and testing
test_iter: the number of iterations in the test phase
test_interval: run a test pass after every this many training iterations
base_lr, momentum, weight_decay: the network's base learning rate, momentum, and weight decay
lr_policy: the learning-rate decay policy
display: print a log line every this many iterations
max_iter: the maximum number of iterations
snapshot: save a snapshot every this many iterations
solver_mode: Caffe's solver mode; choose GPU or CPU as appropriate

 

caffe test -model examples/cifar10/cifar10_quick_train_test.prototxt -weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 -iterations 100
where:
test: run prediction only (the forward pass), with no parameter updates (no backward pass)
-model examples/cifar10/cifar10_quick_train_test.prototxt: specifies the network description file
-weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5: specifies the weight file, i.e. the model trained above
-iterations 100: specifies the number of test iterations; one batch of samples is processed per iteration.
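Why -iterations 100 covers the whole test set: the test-phase batch size in cifar10_quick_train_test.prototxt is 100 (see the "Top shape: 100 3 32 32" lines in the log), so:

```python
# Samples tested = iterations * test-phase batch size.
batch_size = 100   # from the data layer's test-phase batch size
iterations = 100   # the -iterations flag above
print(batch_size * iterations)  # 10000, the full CIFAR-10 test split
```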

 

 


 =======================================    Details    ===============================================

convert_cifar_data.exe data/cifar10 examples/cifar10 lmdb

compute_image_mean.exe -backend=lmdb examples/cifar10/cifar10_train_lmdb examples/cifar10/mean.binaryproto
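compute_image_mean averages the training images pixel-by-pixel into mean.binaryproto, which classification.exe later subtracts from each input. A minimal numpy sketch of that reduction, on synthetic data rather than the real LMDB:

```python
import numpy as np

def mean_image(images):
    # images: array of shape (N, C, H, W) -> per-pixel mean of shape (C, H, W)
    return images.mean(axis=0)

# Three fake 3x32x32 "images" with constant values 0, 100, 200:
imgs = np.stack([np.full((3, 32, 32), v, dtype=np.float32)
                 for v in (0.0, 100.0, 200.0)])
m = mean_image(imgs)
print(m.shape)  # (3, 32, 32)
```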

caffe.exe train --solver=examples/cifar10/cifar10_quick_solver.prototxt

I0703 15:08:42.303948 73968 net.cpp:137] Memory required for data: 1230000
I0703 15:08:42.303948 73968 layer_factory.cpp:58] Creating layer conv1
I0703 15:08:42.303948 73968 net.cpp:84] Creating Layer conv1
I0703 15:08:42.303948 73968 net.cpp:406] conv1 <- data
I0703 15:08:42.303948 73968 net.cpp:380] conv1 -> conv1
I0703 15:08:42.303948 73968 net.cpp:122] Setting up conv1
I0703 15:08:42.303948 73968 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 14337200
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer pool1
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer pool1
I0703 15:08:42.319548 73968 net.cpp:406] pool1 <- conv1
I0703 15:08:42.319548 73968 net.cpp:380] pool1 -> pool1
I0703 15:08:42.319548 73968 net.cpp:122] Setting up pool1
I0703 15:08:42.319548 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 17614000
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer relu1
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer relu1
I0703 15:08:42.319548 73968 net.cpp:406] relu1 <- pool1
I0703 15:08:42.319548 73968 net.cpp:367] relu1 -> pool1 (in-place)
I0703 15:08:42.319548 73968 net.cpp:122] Setting up relu1
I0703 15:08:42.319548 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 20890800
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer conv2
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer conv2
I0703 15:08:42.319548 73968 net.cpp:406] conv2 <- pool1
I0703 15:08:42.319548 73968 net.cpp:380] conv2 -> conv2
I0703 15:08:42.319548 73968 net.cpp:122] Setting up conv2
I0703 15:08:42.319548 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.319548 73968 net.cpp:137] Memory required for data: 24167600
I0703 15:08:42.319548 73968 layer_factory.cpp:58] Creating layer relu2
I0703 15:08:42.319548 73968 net.cpp:84] Creating Layer relu2
I0703 15:08:42.319548 73968 net.cpp:406] relu2 <- conv2
I0703 15:08:42.319548 73968 net.cpp:367] relu2 -> conv2 (in-place)
I0703 15:08:42.335150 73968 net.cpp:122] Setting up relu2
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 27444400
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer pool2
I0703 15:08:42.335150 73968 net.cpp:84] Creating Layer pool2
I0703 15:08:42.335150 73968 net.cpp:406] pool2 <- conv2
I0703 15:08:42.335150 73968 net.cpp:380] pool2 -> pool2
I0703 15:08:42.335150 73968 net.cpp:122] Setting up pool2
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 28263600
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer conv3
I0703 15:08:42.335150 73968 net.cpp:84] Creating Layer conv3
I0703 15:08:42.335150 73968 net.cpp:406] conv3 <- pool2
I0703 15:08:42.335150 73968 net.cpp:380] conv3 -> conv3
I0703 15:08:42.335150 73968 net.cpp:122] Setting up conv3
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 29902000
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer relu3
I0703 15:08:42.335150 73968 net.cpp:84] Creating Layer relu3
I0703 15:08:42.335150 73968 net.cpp:406] relu3 <- conv3
I0703 15:08:42.335150 73968 net.cpp:367] relu3 -> conv3 (in-place)
I0703 15:08:42.335150 73968 net.cpp:122] Setting up relu3
I0703 15:08:42.335150 73968 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:08:42.335150 73968 net.cpp:137] Memory required for data: 31540400
I0703 15:08:42.335150 73968 layer_factory.cpp:58] Creating layer pool3
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer pool3
I0703 15:08:42.350749 73968 net.cpp:406] pool3 <- conv3
I0703 15:08:42.350749 73968 net.cpp:380] pool3 -> pool3
I0703 15:08:42.350749 73968 net.cpp:122] Setting up pool3
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31950000
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer ip1
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer ip1
I0703 15:08:42.350749 73968 net.cpp:406] ip1 <- pool3
I0703 15:08:42.350749 73968 net.cpp:380] ip1 -> ip1
I0703 15:08:42.350749 73968 net.cpp:122] Setting up ip1
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 64 (6400)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31975600
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer ip2
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer ip2
I0703 15:08:42.350749 73968 net.cpp:406] ip2 <- ip1
I0703 15:08:42.350749 73968 net.cpp:380] ip2 -> ip2
I0703 15:08:42.350749 73968 net.cpp:122] Setting up ip2
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31979600
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer ip2_ip2_0_split
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer ip2_ip2_0_split
I0703 15:08:42.350749 73968 net.cpp:406] ip2_ip2_0_split <- ip2
I0703 15:08:42.350749 73968 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0703 15:08:42.350749 73968 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0703 15:08:42.350749 73968 net.cpp:122] Setting up ip2_ip2_0_split
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:08:42.350749 73968 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:08:42.350749 73968 net.cpp:137] Memory required for data: 31987600
I0703 15:08:42.350749 73968 layer_factory.cpp:58] Creating layer accuracy
I0703 15:08:42.350749 73968 net.cpp:84] Creating Layer accuracy
I0703 15:08:42.350749 73968 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0703 15:08:42.350749 73968 net.cpp:406] accuracy <- label_cifar_1_split_0
I0703 15:08:42.366350 73968 net.cpp:380] accuracy -> accuracy
I0703 15:08:42.366350 73968 net.cpp:122] Setting up accuracy
I0703 15:08:42.366350 73968 net.cpp:129] Top shape: (1)
I0703 15:08:42.366350 73968 net.cpp:137] Memory required for data: 31987604
I0703 15:08:42.366350 73968 layer_factory.cpp:58] Creating layer loss
I0703 15:08:42.366350 73968 net.cpp:84] Creating Layer loss
I0703 15:08:42.366350 73968 net.cpp:406] loss <- ip2_ip2_0_split_1
I0703 15:08:42.366350 73968 net.cpp:406] loss <- label_cifar_1_split_1
I0703 15:08:42.366350 73968 net.cpp:380] loss -> loss
I0703 15:08:42.366350 73968 layer_factory.cpp:58] Creating layer loss
I0703 15:08:42.366350 73968 net.cpp:122] Setting up loss
I0703 15:08:42.366350 73968 net.cpp:129] Top shape: (1)
I0703 15:08:42.366350 73968 net.cpp:132] with loss weight 1
I0703 15:08:42.366350 73968 net.cpp:137] Memory required for data: 31987608
I0703 15:08:42.366350 73968 net.cpp:198] loss needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:200] accuracy does not need backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] ip2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] ip1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] pool3 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] relu3 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] conv3 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] pool2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] relu2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] conv2 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] relu1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] pool1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:198] conv1 needs backward computation.
I0703 15:08:42.366350 73968 net.cpp:200] label_cifar_1_split does not need backward computation.
I0703 15:08:42.366350 73968 net.cpp:200] cifar does not need backward computation.
I0703 15:08:42.366350 73968 net.cpp:242] This network produces output accuracy
I0703 15:08:42.366350 73968 net.cpp:242] This network produces output loss
I0703 15:08:42.366350 73968 net.cpp:255] Network initialization done.
I0703 15:08:42.366350 73968 solver.cpp:56] Solver scaffolding done.
I0703 15:08:42.366350 73968 caffe.cpp:249] Starting Optimization
I0703 15:08:42.366350 73968 solver.cpp:272] Solving CIFAR10_quick
I0703 15:08:42.366350 73968 solver.cpp:273] Learning Rate Policy: fixed
I0703 15:08:42.413151 73968 solver.cpp:330] Iteration 0, Testing net (#0)
I0703 15:08:43.629990 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:08:43.661191 73968 solver.cpp:397] Test net output #0: accuracy = 0.0986
I0703 15:08:43.661191 73968 solver.cpp:397] Test net output #1: loss = 2.30244 (* 1 = 2.30244 loss)
I0703 15:08:43.707993 73968 solver.cpp:218] Iteration 0 (-5.42863e-042 iter/s, 1.29273s/100 iters), loss = 2.30274
I0703 15:08:43.707993 73968 solver.cpp:237] Train net output #0: loss = 2.30274 (* 1 = 2.30274 loss)
I0703 15:08:43.707993 73968 sgd_solver.cpp:105] Iteration 0, lr = 0.001
I0703 15:08:46.640887 73968 solver.cpp:218] Iteration 100 (34.1913 iter/s, 2.92472s/100 iters), loss = 1.65303
I0703 15:08:46.640887 73968 solver.cpp:237] Train net output #0: loss = 1.65303 (* 1 = 1.65303 loss)
I0703 15:08:46.640887 73968 sgd_solver.cpp:105] Iteration 100, lr = 0.001
I0703 15:08:49.563181 73968 solver.cpp:218] Iteration 200 (34.1799 iter/s, 2.9257s/100 iters), loss = 1.60865
I0703 15:08:49.563181 73968 solver.cpp:237] Train net output #0: loss = 1.60865 (* 1 = 1.60865 loss)
I0703 15:08:49.563181 73968 sgd_solver.cpp:105] Iteration 200, lr = 0.001
I0703 15:08:52.480474 73968 solver.cpp:218] Iteration 300 (34.2469 iter/s, 2.91998s/100 iters), loss = 1.185
I0703 15:08:52.480474 73968 solver.cpp:237] Train net output #0: loss = 1.185 (* 1 = 1.185 loss)
I0703 15:08:52.480474 73968 sgd_solver.cpp:105] Iteration 300, lr = 0.001
I0703 15:08:55.428969 73968 solver.cpp:218] Iteration 400 (34.0078 iter/s, 2.9405s/100 iters), loss = 1.22841
I0703 15:08:55.428969 73968 solver.cpp:237] Train net output #0: loss = 1.22841 (* 1 = 1.22841 loss)
I0703 15:08:55.428969 73968 sgd_solver.cpp:105] Iteration 400, lr = 0.001
I0703 15:08:58.253659 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:08:58.347262 73968 solver.cpp:330] Iteration 500, Testing net (#0)
I0703 15:08:59.486099 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:08:59.517300 73968 solver.cpp:397] Test net output #0: accuracy = 0.5535
I0703 15:08:59.517300 73968 solver.cpp:397] Test net output #1: loss = 1.2742 (* 1 = 1.2742 loss)
I0703 15:08:59.548501 73968 solver.cpp:218] Iteration 500 (24.261 iter/s, 4.12185s/100 iters), loss = 1.23252
I0703 15:08:59.548501 73968 solver.cpp:237] Train net output #0: loss = 1.23252 (* 1 = 1.23252 loss)
I0703 15:08:59.548501 73968 sgd_solver.cpp:105] Iteration 500, lr = 0.001
I0703 15:09:02.512596 73968 solver.cpp:218] Iteration 600 (33.7982 iter/s, 2.95873s/100 iters), loss = 1.25093
I0703 15:09:02.512596 73968 solver.cpp:237] Train net output #0: loss = 1.25093 (* 1 = 1.25093 loss)
I0703 15:09:02.512596 73968 sgd_solver.cpp:105] Iteration 600, lr = 0.001
I0703 15:09:05.493698 73968 solver.cpp:218] Iteration 700 (33.3689 iter/s, 2.99681s/100 iters), loss = 1.15823
I0703 15:09:05.494699 73968 solver.cpp:237] Train net output #0: loss = 1.15823 (* 1 = 1.15823 loss)
I0703 15:09:05.494699 73968 sgd_solver.cpp:105] Iteration 700, lr = 0.001
I0703 15:09:08.483223 73968 solver.cpp:218] Iteration 800 (33.6528 iter/s, 2.97152s/100 iters), loss = 1.02646
I0703 15:09:08.483223 73968 solver.cpp:237] Train net output #0: loss = 1.02646 (* 1 = 1.02646 loss)
I0703 15:09:08.483223 73968 sgd_solver.cpp:105] Iteration 800, lr = 0.001
I0703 15:09:11.447319 73968 solver.cpp:218] Iteration 900 (33.8917 iter/s, 2.95058s/100 iters), loss = 1.09516
I0703 15:09:11.447319 73968 solver.cpp:237] Train net output #0: loss = 1.09516 (* 1 = 1.09516 loss)
I0703 15:09:11.447319 73968 sgd_solver.cpp:105] Iteration 900, lr = 0.001
I0703 15:09:14.285416 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:14.394620 73968 solver.cpp:330] Iteration 1000, Testing net (#0)
I0703 15:09:15.551057 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:15.597858 73968 solver.cpp:397] Test net output #0: accuracy = 0.6325
I0703 15:09:15.597858 73968 solver.cpp:397] Test net output #1: loss = 1.05842 (* 1 = 1.05842 loss)
I0703 15:09:15.629060 73968 solver.cpp:218] Iteration 1000 (23.9025 iter/s, 4.18367s/100 iters), loss = 0.983432
I0703 15:09:15.629060 73968 solver.cpp:237] Train net output #0: loss = 0.983432 (* 1 = 0.983432 loss)
I0703 15:09:15.629060 73968 sgd_solver.cpp:105] Iteration 1000, lr = 0.001
I0703 15:09:18.599957 73968 solver.cpp:218] Iteration 1100 (33.8006 iter/s, 2.95853s/100 iters), loss = 1.06141
I0703 15:09:18.599957 73968 solver.cpp:237] Train net output #0: loss = 1.06141 (* 1 = 1.06141 loss)
I0703 15:09:18.599957 73968 sgd_solver.cpp:105] Iteration 1100, lr = 0.001
I0703 15:09:19.317580 73968 blocking_queue.cpp:49] Waiting for data
I0703 15:09:21.548452 73968 solver.cpp:218] Iteration 1200 (33.9381 iter/s, 2.94654s/100 iters), loss = 0.950789
I0703 15:09:21.548452 73968 solver.cpp:237] Train net output #0: loss = 0.950789 (* 1 = 0.950789 loss)
I0703 15:09:21.548452 73968 sgd_solver.cpp:105] Iteration 1200, lr = 0.001
I0703 15:09:24.512547 73968 solver.cpp:218] Iteration 1300 (33.7456 iter/s, 2.96335s/100 iters), loss = 0.845029
I0703 15:09:24.512547 73968 solver.cpp:237] Train net output #0: loss = 0.845029 (* 1 = 0.845029 loss)
I0703 15:09:24.512547 73968 sgd_solver.cpp:105] Iteration 1300, lr = 0.001
I0703 15:09:27.538540 73968 solver.cpp:218] Iteration 1400 (33.0713 iter/s, 3.02377s/100 iters), loss = 0.854708
I0703 15:09:27.538540 73968 solver.cpp:237] Train net output #0: loss = 0.854708 (* 1 = 0.854708 loss)
I0703 15:09:27.538540 73968 sgd_solver.cpp:105] Iteration 1400, lr = 0.001
I0703 15:09:30.405486 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:30.499089 73968 solver.cpp:330] Iteration 1500, Testing net (#0)
I0703 15:09:31.642729 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:31.689529 73968 solver.cpp:397] Test net output #0: accuracy = 0.666
I0703 15:09:31.689529 73968 solver.cpp:397] Test net output #1: loss = 0.965108 (* 1 = 0.965108 loss)
I0703 15:09:31.706130 73968 solver.cpp:218] Iteration 1500 (23.9048 iter/s, 4.18326s/100 iters), loss = 0.803068
I0703 15:09:31.706130 73968 solver.cpp:237] Train net output #0: loss = 0.803068 (* 1 = 0.803068 loss)
I0703 15:09:31.706130 73968 sgd_solver.cpp:105] Iteration 1500, lr = 0.001
I0703 15:09:34.671032 73968 solver.cpp:218] Iteration 1600 (33.9581 iter/s, 2.94481s/100 iters), loss = 0.89545
I0703 15:09:34.671032 73968 solver.cpp:237] Train net output #0: loss = 0.89545 (* 1 = 0.89545 loss)
I0703 15:09:34.671032 73968 sgd_solver.cpp:105] Iteration 1600, lr = 0.001
I0703 15:09:37.666788 73968 solver.cpp:218] Iteration 1700 (33.1725 iter/s, 3.01455s/100 iters), loss = 0.858076
I0703 15:09:37.667788 73968 solver.cpp:237] Train net output #0: loss = 0.858076 (* 1 = 0.858076 loss)
I0703 15:09:37.667788 73968 sgd_solver.cpp:105] Iteration 1700, lr = 0.001
I0703 15:09:40.681776 73968 solver.cpp:218] Iteration 1800 (33.4777 iter/s, 2.98707s/100 iters), loss = 0.739417
I0703 15:09:40.681776 73968 solver.cpp:237] Train net output #0: loss = 0.739417 (* 1 = 0.739417 loss)
I0703 15:09:40.681776 73968 sgd_solver.cpp:105] Iteration 1800, lr = 0.001
I0703 15:09:43.649132 73968 solver.cpp:218] Iteration 1900 (33.6842 iter/s, 2.96875s/100 iters), loss = 0.755557
I0703 15:09:43.649132 73968 solver.cpp:237] Train net output #0: loss = 0.755557 (* 1 = 0.755557 loss)
I0703 15:09:43.649132 73968 sgd_solver.cpp:105] Iteration 1900, lr = 0.001
I0703 15:09:46.534909 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:46.628511 73968 solver.cpp:330] Iteration 2000, Testing net (#0)
I0703 15:09:47.772374 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:09:47.819176 73968 solver.cpp:397] Test net output #0: accuracy = 0.7016
I0703 15:09:47.819176 73968 solver.cpp:397] Test net output #1: loss = 0.878213 (* 1 = 0.878213 loss)
I0703 15:09:47.850378 73968 solver.cpp:218] Iteration 2000 (23.8369 iter/s, 4.19518s/100 iters), loss = 0.678401
I0703 15:09:47.850378 73968 solver.cpp:237] Train net output #0: loss = 0.678401 (* 1 = 0.678401 loss)
I0703 15:09:47.850378 73968 sgd_solver.cpp:105] Iteration 2000, lr = 0.001
I0703 15:09:50.791185 73968 solver.cpp:218] Iteration 2100 (33.7445 iter/s, 2.96345s/100 iters), loss = 0.790343
I0703 15:09:50.792186 73968 solver.cpp:237] Train net output #0: loss = 0.790343 (* 1 = 0.790343 loss)
I0703 15:09:50.792186 73968 sgd_solver.cpp:105] Iteration 2100, lr = 0.001
I0703 15:09:53.723479 73968 solver.cpp:218] Iteration 2200 (34.139 iter/s, 2.9292s/100 iters), loss = 0.792828
I0703 15:09:53.723479 73968 solver.cpp:237] Train net output #0: loss = 0.792828 (* 1 = 0.792828 loss)
I0703 15:09:53.723479 73968 sgd_solver.cpp:105] Iteration 2200, lr = 0.001
I0703 15:09:56.723778 73968 solver.cpp:218] Iteration 2300 (33.3467 iter/s, 2.9988s/100 iters), loss = 0.634268
I0703 15:09:56.724778 73968 solver.cpp:237] Train net output #0: loss = 0.634268 (* 1 = 0.634268 loss)
I0703 15:09:56.724778 73968 sgd_solver.cpp:105] Iteration 2300, lr = 0.001
I0703 15:09:59.751081 73968 solver.cpp:218] Iteration 2400 (33.0722 iter/s, 3.02369s/100 iters), loss = 0.730874
I0703 15:09:59.752081 73968 solver.cpp:237] Train net output #0: loss = 0.730874 (* 1 = 0.730874 loss)
I0703 15:09:59.752081 73968 sgd_solver.cpp:105] Iteration 2400, lr = 0.001
I0703 15:10:02.642370 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:02.746381 73968 solver.cpp:330] Iteration 2500, Testing net (#0)
I0703 15:10:03.921499 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:03.966502 73968 solver.cpp:397] Test net output #0: accuracy = 0.7126
I0703 15:10:03.966502 73968 solver.cpp:397] Test net output #1: loss = 0.845339 (* 1 = 0.845339 loss)
I0703 15:10:03.996505 73968 solver.cpp:218] Iteration 2500 (23.57 iter/s, 4.24269s/100 iters), loss = 0.613872
I0703 15:10:03.997506 73968 solver.cpp:237] Train net output #0: loss = 0.613872 (* 1 = 0.613872 loss)
I0703 15:10:03.997506 73968 sgd_solver.cpp:105] Iteration 2500, lr = 0.001
I0703 15:10:06.947801 73968 solver.cpp:218] Iteration 2600 (33.9124 iter/s, 2.94878s/100 iters), loss = 0.713529
I0703 15:10:06.948801 73968 solver.cpp:237] Train net output #0: loss = 0.713529 (* 1 = 0.713529 loss)
I0703 15:10:06.949801 73968 sgd_solver.cpp:105] Iteration 2600, lr = 0.001
I0703 15:10:09.897095 73968 solver.cpp:218] Iteration 2700 (33.94 iter/s, 2.94638s/100 iters), loss = 0.749147
I0703 15:10:09.898097 73968 solver.cpp:237] Train net output #0: loss = 0.749147 (* 1 = 0.749147 loss)
I0703 15:10:09.898097 73968 sgd_solver.cpp:105] Iteration 2700, lr = 0.001
I0703 15:10:12.863761 73968 solver.cpp:218] Iteration 2800 (33.7369 iter/s, 2.96411s/100 iters), loss = 0.570446
I0703 15:10:12.864761 73968 solver.cpp:237] Train net output #0: loss = 0.570446 (* 1 = 0.570446 loss)
I0703 15:10:12.865762 73968 sgd_solver.cpp:105] Iteration 2800, lr = 0.001
I0703 15:10:15.809679 73968 solver.cpp:218] Iteration 2900 (33.9821 iter/s, 2.94273s/100 iters), loss = 0.713307
I0703 15:10:15.810678 73968 solver.cpp:237] Train net output #0: loss = 0.713307 (* 1 = 0.713307 loss)
I0703 15:10:15.811678 73968 sgd_solver.cpp:105] Iteration 2900, lr = 0.001
I0703 15:10:18.606696 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:18.704706 73968 solver.cpp:330] Iteration 3000, Testing net (#0)
I0703 15:10:19.830818 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:19.873823 73968 solver.cpp:397] Test net output #0: accuracy = 0.7197
I0703 15:10:19.874824 73968 solver.cpp:397] Test net output #1: loss = 0.835526 (* 1 = 0.835526 loss)
I0703 15:10:19.903826 73968 solver.cpp:218] Iteration 3000 (24.4437 iter/s, 4.09103s/100 iters), loss = 0.57974
I0703 15:10:19.904826 73968 solver.cpp:237] Train net output #0: loss = 0.57974 (* 1 = 0.57974 loss)
I0703 15:10:19.905827 73968 sgd_solver.cpp:105] Iteration 3000, lr = 0.001
I0703 15:10:22.878123 73968 solver.cpp:218] Iteration 3100 (33.657 iter/s, 2.97115s/100 iters), loss = 0.664127
I0703 15:10:22.879123 73968 solver.cpp:237] Train net output #0: loss = 0.664127 (* 1 = 0.664127 loss)
I0703 15:10:22.879123 73968 sgd_solver.cpp:105] Iteration 3100, lr = 0.001
I0703 15:10:25.912427 73968 solver.cpp:218] Iteration 3200 (32.9824 iter/s, 3.03192s/100 iters), loss = 0.726144
I0703 15:10:25.913427 73968 solver.cpp:237] Train net output #0: loss = 0.726144 (* 1 = 0.726144 loss)
I0703 15:10:25.914427 73968 sgd_solver.cpp:105] Iteration 3200, lr = 0.001
I0703 15:10:28.902726 73968 solver.cpp:218] Iteration 3300 (33.4837 iter/s, 2.98653s/100 iters), loss = 0.597564
I0703 15:10:28.902726 73968 solver.cpp:237] Train net output #0: loss = 0.597564 (* 1 = 0.597564 loss)
I0703 15:10:28.903726 73968 sgd_solver.cpp:105] Iteration 3300, lr = 0.001
I0703 15:10:31.947031 73968 solver.cpp:218] Iteration 3400 (32.872 iter/s, 3.0421s/100 iters), loss = 0.663627
I0703 15:10:31.948030 73968 solver.cpp:237] Train net output #0: loss = 0.663627 (* 1 = 0.663627 loss)
I0703 15:10:31.948030 73968 sgd_solver.cpp:105] Iteration 3400, lr = 0.001
I0703 15:10:34.875890 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:34.976495 73968 solver.cpp:330] Iteration 3500, Testing net (#0)
I0703 15:10:36.121947 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:36.185348 73968 solver.cpp:397] Test net output #0: accuracy = 0.7187
I0703 15:10:36.185348 73968 solver.cpp:397] Test net output #1: loss = 0.862067 (* 1 = 0.862067 loss)
I0703 15:10:36.216549 73968 solver.cpp:218] Iteration 3500 (23.5687 iter/s, 4.24291s/100 iters), loss = 0.560303
I0703 15:10:36.216549 73968 solver.cpp:237] Train net output #0: loss = 0.560303 (* 1 = 0.560303 loss)
I0703 15:10:36.216549 73968 sgd_solver.cpp:105] Iteration 3500, lr = 0.001
I0703 15:10:39.186511 73968 solver.cpp:218] Iteration 3600 (33.6513 iter/s, 2.97165s/100 iters), loss = 0.619981
I0703 15:10:39.186511 73968 solver.cpp:237] Train net output #0: loss = 0.619981 (* 1 = 0.619981 loss)
I0703 15:10:39.186511 73968 sgd_solver.cpp:105] Iteration 3600, lr = 0.001
I0703 15:10:42.190245 73968 solver.cpp:218] Iteration 3700 (33.4028 iter/s, 2.99376s/100 iters), loss = 0.798935
I0703 15:10:42.190245 73968 solver.cpp:237] Train net output #0: loss = 0.798935 (* 1 = 0.798935 loss)
I0703 15:10:42.190245 73968 sgd_solver.cpp:105] Iteration 3700, lr = 0.001
I0703 15:10:45.185573 73968 solver.cpp:218] Iteration 3800 (33.3392 iter/s, 2.99947s/100 iters), loss = 0.563457
I0703 15:10:45.185573 73968 solver.cpp:237] Train net output #0: loss = 0.563457 (* 1 = 0.563457 loss)
I0703 15:10:45.185573 73968 sgd_solver.cpp:105] Iteration 3800, lr = 0.001
I0703 15:10:48.180902 73968 solver.cpp:218] Iteration 3900 (33.4396 iter/s, 2.99047s/100 iters), loss = 0.629793
I0703 15:10:48.180902 73968 solver.cpp:237] Train net output #0: loss = 0.629793 (* 1 = 0.629793 loss)
I0703 15:10:48.180902 73968 sgd_solver.cpp:105] Iteration 3900, lr = 0.001
I0703 15:10:51.034423 71148 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:51.128026 73968 solver.cpp:447] Snapshotting to binary proto file examples/cifar10/cifar10_quick_iter_4000.caffemodel
I0703 15:10:51.159227 73968 sgd_solver.cpp:273] Snapshotting solver state to binary proto file examples/cifar10/cifar10_quick_iter_4000.solverstate
I0703 15:10:51.174827 73968 solver.cpp:310] Iteration 4000, loss = 0.518084
I0703 15:10:51.174827 73968 solver.cpp:330] Iteration 4000, Testing net (#0)
I0703 15:10:52.338073 70280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:10:52.378077 73968 solver.cpp:397] Test net output #0: accuracy = 0.7254
I0703 15:10:52.379077 73968 solver.cpp:397] Test net output #1: loss = 0.841199 (* 1 = 0.841199 loss)
I0703 15:10:52.380077 73968 solver.cpp:315] Optimization Done.
I0703 15:10:52.380077 73968 caffe.cpp:260] Optimization Done.

D:\ws_caffe\caffe>

-----------------------------------------------------------------------------------------------------------

caffe train --solver=examples/cifar10/cifar10_quick_solver_lr1.prototxt --snapshot=examples/cifar10/cifar10_quick_iter_4000.solverstate

layer {
name: "conv2"
type: "Convolution"
bottom: "pool1"
top: "conv2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 32
pad: 2
kernel_size: 5
stride: 1
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "relu2"
type: "ReLU"
bottom: "conv2"
top: "conv2"
}
layer {
name: "pool2"
type: "Pooling"
bottom: "conv2"
top: "pool2"
pooling_param {
pool: AVE
kernel_size: 3
stride: 2
}
}
layer {
name: "conv3"
type: "Convolution"
bottom: "pool2"
top: "conv3"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 64
pad: 2
kernel_size: 5
stride: 1
weight_filler {
type: "gaussian"
std: 0.01
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "relu3"
type: "ReLU"
bottom: "conv3"
top: "conv3"
}
layer {
name: "pool3"
type: "Pooling"
bottom: "conv3"
top: "pool3"
pooling_param {
pool: AVE
kernel_size: 3
stride: 2
}
}
layer {
name: "ip1"
type: "InnerProduct"
bottom: "pool3"
top: "ip1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 64
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "ip2"
type: "InnerProduct"
bottom: "ip1"
top: "ip2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 10
weight_filler {
type: "gaussian"
std: 0.1
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "ip2"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "ip2"
bottom: "label"
top: "loss"
}
I0703 15:30:21.819789 74500 layer_factory.cpp:58] Creating layer cifar
I0703 15:30:21.819789 74500 db_lmdb.cpp:40] Opened lmdb examples/cifar10/cifar10_test_lmdb
I0703 15:30:21.819789 74500 net.cpp:84] Creating Layer cifar
I0703 15:30:21.819789 74500 net.cpp:380] cifar -> data
I0703 15:30:21.819789 74500 net.cpp:380] cifar -> label
I0703 15:30:21.819789 74500 data_transformer.cpp:25] Loading mean file from: examples/cifar10/mean.binaryproto
I0703 15:30:21.835391 74500 data_layer.cpp:45] output data size: 100,3,32,32
I0703 15:30:21.835391 74500 net.cpp:122] Setting up cifar
I0703 15:30:21.835391 74500 net.cpp:129] Top shape: 100 3 32 32 (307200)
I0703 15:30:21.835391 74500 net.cpp:129] Top shape: 100 (100)
I0703 15:30:21.835391 74500 net.cpp:137] Memory required for data: 1229200
I0703 15:30:21.835391 74500 layer_factory.cpp:58] Creating layer label_cifar_1_split
I0703 15:30:21.835391 69056 common.cpp:36] System entropy source not available, using fallback algorithm to generate seed instead.
I0703 15:30:21.835391 74500 net.cpp:84] Creating Layer label_cifar_1_split
I0703 15:30:21.835391 74500 net.cpp:406] label_cifar_1_split <- label
I0703 15:30:21.835391 74500 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_0
I0703 15:30:21.835391 74500 net.cpp:380] label_cifar_1_split -> label_cifar_1_split_1
I0703 15:30:21.850993 74500 net.cpp:122] Setting up label_cifar_1_split
I0703 15:30:21.850993 74500 net.cpp:129] Top shape: 100 (100)
I0703 15:30:21.850993 74500 net.cpp:129] Top shape: 100 (100)
I0703 15:30:21.850993 74500 net.cpp:137] Memory required for data: 1230000
I0703 15:30:21.850993 74500 layer_factory.cpp:58] Creating layer conv1
I0703 15:30:21.850993 74500 net.cpp:84] Creating Layer conv1
I0703 15:30:21.850993 74500 net.cpp:406] conv1 <- data
I0703 15:30:21.850993 74500 net.cpp:380] conv1 -> conv1
I0703 15:30:21.850993 74500 net.cpp:122] Setting up conv1
I0703 15:30:21.850993 74500 net.cpp:129] Top shape: 100 32 32 32 (3276800)
I0703 15:30:21.850993 74500 net.cpp:137] Memory required for data: 14337200
I0703 15:30:21.850993 74500 layer_factory.cpp:58] Creating layer pool1
I0703 15:30:21.850993 74500 net.cpp:84] Creating Layer pool1
I0703 15:30:21.850993 74500 net.cpp:406] pool1 <- conv1
I0703 15:30:21.850993 74500 net.cpp:380] pool1 -> pool1
I0703 15:30:21.850993 74500 net.cpp:122] Setting up pool1
I0703 15:30:21.850993 74500 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:30:21.850993 74500 net.cpp:137] Memory required for data: 17614000
I0703 15:30:21.850993 74500 layer_factory.cpp:58] Creating layer relu1
I0703 15:30:21.850993 74500 net.cpp:84] Creating Layer relu1
I0703 15:30:21.850993 74500 net.cpp:406] relu1 <- pool1
I0703 15:30:21.850993 74500 net.cpp:367] relu1 -> pool1 (in-place)
I0703 15:30:21.850993 74500 net.cpp:122] Setting up relu1
I0703 15:30:21.850993 74500 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:30:21.850993 74500 net.cpp:137] Memory required for data: 20890800
I0703 15:30:21.850993 74500 layer_factory.cpp:58] Creating layer conv2
I0703 15:30:21.850993 74500 net.cpp:84] Creating Layer conv2
I0703 15:30:21.850993 74500 net.cpp:406] conv2 <- pool1
I0703 15:30:21.850993 74500 net.cpp:380] conv2 -> conv2
I0703 15:30:21.866595 74500 net.cpp:122] Setting up conv2
I0703 15:30:21.866595 74500 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:30:21.866595 74500 net.cpp:137] Memory required for data: 24167600
I0703 15:30:21.866595 74500 layer_factory.cpp:58] Creating layer relu2
I0703 15:30:21.866595 74500 net.cpp:84] Creating Layer relu2
I0703 15:30:21.866595 74500 net.cpp:406] relu2 <- conv2
I0703 15:30:21.866595 74500 net.cpp:367] relu2 -> conv2 (in-place)
I0703 15:30:21.866595 74500 net.cpp:122] Setting up relu2
I0703 15:30:21.866595 74500 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:30:21.866595 74500 net.cpp:137] Memory required for data: 27444400
I0703 15:30:21.866595 74500 layer_factory.cpp:58] Creating layer pool2
I0703 15:30:21.866595 74500 net.cpp:84] Creating Layer pool2
I0703 15:30:21.866595 74500 net.cpp:406] pool2 <- conv2
I0703 15:30:21.866595 74500 net.cpp:380] pool2 -> pool2
I0703 15:30:21.866595 74500 net.cpp:122] Setting up pool2
I0703 15:30:21.866595 74500 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0703 15:30:21.866595 74500 net.cpp:137] Memory required for data: 28263600
I0703 15:30:21.866595 74500 layer_factory.cpp:58] Creating layer conv3
I0703 15:30:21.866595 74500 net.cpp:84] Creating Layer conv3
I0703 15:30:21.866595 74500 net.cpp:406] conv3 <- pool2
I0703 15:30:21.866595 74500 net.cpp:380] conv3 -> conv3
I0703 15:30:21.882197 74500 net.cpp:122] Setting up conv3
I0703 15:30:21.882197 74500 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:30:21.882197 74500 net.cpp:137] Memory required for data: 29902000
I0703 15:30:21.882197 74500 layer_factory.cpp:58] Creating layer relu3
I0703 15:30:21.882197 74500 net.cpp:84] Creating Layer relu3
I0703 15:30:21.882197 74500 net.cpp:406] relu3 <- conv3
I0703 15:30:21.882197 74500 net.cpp:367] relu3 -> conv3 (in-place)
I0703 15:30:21.882197 74500 net.cpp:122] Setting up relu3
I0703 15:30:21.882197 74500 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:30:21.882197 74500 net.cpp:137] Memory required for data: 31540400
I0703 15:30:21.882197 74500 layer_factory.cpp:58] Creating layer pool3
I0703 15:30:21.882197 74500 net.cpp:84] Creating Layer pool3
I0703 15:30:21.882197 74500 net.cpp:406] pool3 <- conv3
I0703 15:30:21.882197 74500 net.cpp:380] pool3 -> pool3
I0703 15:30:21.882197 74500 net.cpp:122] Setting up pool3
I0703 15:30:21.882197 74500 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0703 15:30:21.882197 74500 net.cpp:137] Memory required for data: 31950000
I0703 15:30:21.882197 74500 layer_factory.cpp:58] Creating layer ip1
I0703 15:30:21.882197 74500 net.cpp:84] Creating Layer ip1
I0703 15:30:21.882197 74500 net.cpp:406] ip1 <- pool3
I0703 15:30:21.882197 74500 net.cpp:380] ip1 -> ip1
I0703 15:30:21.882197 74500 net.cpp:122] Setting up ip1
I0703 15:30:21.897799 74500 net.cpp:129] Top shape: 100 64 (6400)
I0703 15:30:21.897799 74500 net.cpp:137] Memory required for data: 31975600
I0703 15:30:21.897799 74500 layer_factory.cpp:58] Creating layer ip2
I0703 15:30:21.897799 74500 net.cpp:84] Creating Layer ip2
I0703 15:30:21.897799 74500 net.cpp:406] ip2 <- ip1
I0703 15:30:21.897799 74500 net.cpp:380] ip2 -> ip2
I0703 15:30:21.897799 74500 net.cpp:122] Setting up ip2
I0703 15:30:21.897799 74500 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:30:21.897799 74500 net.cpp:137] Memory required for data: 31979600
I0703 15:30:21.897799 74500 layer_factory.cpp:58] Creating layer ip2_ip2_0_split
I0703 15:30:21.897799 74500 net.cpp:84] Creating Layer ip2_ip2_0_split
I0703 15:30:21.897799 74500 net.cpp:406] ip2_ip2_0_split <- ip2
I0703 15:30:21.897799 74500 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0703 15:30:21.897799 74500 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0703 15:30:21.897799 74500 net.cpp:122] Setting up ip2_ip2_0_split
I0703 15:30:21.897799 74500 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:30:21.897799 74500 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:30:21.897799 74500 net.cpp:137] Memory required for data: 31987600
I0703 15:30:21.897799 74500 layer_factory.cpp:58] Creating layer accuracy
I0703 15:30:21.897799 74500 net.cpp:84] Creating Layer accuracy
I0703 15:30:21.897799 74500 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0703 15:30:21.897799 74500 net.cpp:406] accuracy <- label_cifar_1_split_0
I0703 15:30:21.897799 74500 net.cpp:380] accuracy -> accuracy
I0703 15:30:21.897799 74500 net.cpp:122] Setting up accuracy
I0703 15:30:21.897799 74500 net.cpp:129] Top shape: (1)
I0703 15:30:21.897799 74500 net.cpp:137] Memory required for data: 31987604
I0703 15:30:21.897799 74500 layer_factory.cpp:58] Creating layer loss
I0703 15:30:21.897799 74500 net.cpp:84] Creating Layer loss
I0703 15:30:21.897799 74500 net.cpp:406] loss <- ip2_ip2_0_split_1
I0703 15:30:21.897799 74500 net.cpp:406] loss <- label_cifar_1_split_1
I0703 15:30:21.897799 74500 net.cpp:380] loss -> loss
I0703 15:30:21.897799 74500 layer_factory.cpp:58] Creating layer loss
I0703 15:30:21.897799 74500 net.cpp:122] Setting up loss
I0703 15:30:21.897799 74500 net.cpp:129] Top shape: (1)
I0703 15:30:21.897799 74500 net.cpp:132] with loss weight 1
I0703 15:30:21.897799 74500 net.cpp:137] Memory required for data: 31987608
I0703 15:30:21.913401 74500 net.cpp:198] loss needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:200] accuracy does not need backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] ip2 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] ip1 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] pool3 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] relu3 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] conv3 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] pool2 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] relu2 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] conv2 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] relu1 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] pool1 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:198] conv1 needs backward computation.
I0703 15:30:21.913401 74500 net.cpp:200] label_cifar_1_split does not need backward computation.
I0703 15:30:21.913401 74500 net.cpp:200] cifar does not need backward computation.
I0703 15:30:21.913401 74500 net.cpp:242] This network produces output accuracy
I0703 15:30:21.913401 74500 net.cpp:242] This network produces output loss
I0703 15:30:21.913401 74500 net.cpp:255] Network initialization done.
I0703 15:30:21.913401 74500 solver.cpp:56] Solver scaffolding done.
I0703 15:30:21.913401 74500 caffe.cpp:249] Starting Optimization
I0703 15:30:21.913401 74500 solver.cpp:272] Solving CIFAR10_quick
I0703 15:30:21.913401 74500 solver.cpp:273] Learning Rate Policy: fixed
I0703 15:30:21.913401 74500 solver.cpp:330] Iteration 0, Testing net (#0)
I0703 15:30:22.287849 74500 blocking_queue.cpp:49] Waiting for data
I0703 15:30:23.161561 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:30:23.192765 74500 solver.cpp:397] Test net output #0: accuracy = 0.1001
I0703 15:30:23.192765 74500 solver.cpp:397] Test net output #1: loss = 2.30252 (* 1 = 2.30252 loss)
I0703 15:30:23.223969 74500 solver.cpp:218] Iteration 0 (0 iter/s, 1.31383s/100 iters), loss = 2.30354
I0703 15:30:23.223969 74500 solver.cpp:237] Train net output #0: loss = 2.30354 (* 1 = 2.30354 loss)
I0703 15:30:23.223969 74500 sgd_solver.cpp:105] Iteration 0, lr = 0.001
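To plot the training curve from console dumps like the one above, the iteration/loss pairs can be pulled out of the solver.cpp:218 lines with a small regex (a minimal sketch; Caffe also ships a log parser under tools/extra, which is the more robust option):

```python
import re

# Matches lines like:
#   "... solver.cpp:218] Iteration 100 (28.0669 iter/s, 3.56291s/100 iters), loss = 1.86101"
LOG_RE = re.compile(r"solver\.cpp:218\] Iteration (\d+) .*?loss = ([\d.]+)")

def parse_loss(lines):
    """Return (iteration, training loss) pairs from Caffe console output."""
    pairs = []
    for line in lines:
        m = LOG_RE.search(line)
        if m:
            pairs.append((int(m.group(1)), float(m.group(2))))
    return pairs

sample = [
    "I0703 15:30:26.796828 74500 solver.cpp:218] Iteration 100 (28.0669 iter/s, 3.56291s/100 iters), loss = 1.86101",
    "I0703 15:30:27.935773 74500 solver.cpp:237] Train net output #0: loss = 1.86101 (* 1 = 1.86101 loss)",
]
print(parse_loss(sample))  # [(100, 1.86101)]
```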

 

 

I0703 15:30:26.796828 74500 solver.cpp:218] Iteration 100 (28.0669 iter/s, 3.56291s/100 iters), loss = 1.86101
I0703 15:30:27.935773 74500 solver.cpp:237] Train net output #0: loss = 1.86101 (* 1 = 1.86101 loss)
I0703 15:30:27.935773 74500 sgd_solver.cpp:105] Iteration 100, lr = 0.001
I0703 15:30:30.860167 74500 solver.cpp:218] Iteration 200 (34.2769 iter/s, 2.91742s/100 iters), loss = 1.70045
I0703 15:30:30.860167 74500 solver.cpp:237] Train net output #0: loss = 1.70045 (* 1 = 1.70045 loss)
I0703 15:30:30.860167 74500 sgd_solver.cpp:105] Iteration 200, lr = 0.001
I0703 15:30:33.818975 74500 solver.cpp:218] Iteration 300 (33.8484 iter/s, 2.95435s/100 iters), loss = 1.28248
I0703 15:30:33.818975 74500 solver.cpp:237] Train net output #0: loss = 1.28248 (* 1 = 1.28248 loss)
I0703 15:30:33.818975 74500 sgd_solver.cpp:105] Iteration 300, lr = 0.001
I0703 15:30:36.720947 74500 solver.cpp:218] Iteration 400 (34.3612 iter/s, 2.91026s/100 iters), loss = 1.3722
I0703 15:30:36.720947 74500 solver.cpp:237] Train net output #0: loss = 1.3722 (* 1 = 1.3722 loss)
I0703 15:30:36.720947 74500 sgd_solver.cpp:105] Iteration 400, lr = 0.001
I0703 15:30:39.498103 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:30:39.576113 74500 solver.cpp:330] Iteration 500, Testing net (#0)
I0703 15:30:40.777467 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:30:40.824273 74500 solver.cpp:397] Test net output #0: accuracy = 0.5616
I0703 15:30:40.824273 74500 solver.cpp:397] Test net output #1: loss = 1.25471 (* 1 = 1.25471 loss)
I0703 15:30:40.839875 74500 solver.cpp:218] Iteration 500 (24.2912 iter/s, 4.11672s/100 iters), loss = 1.17465
I0703 15:30:40.855478 74500 solver.cpp:237] Train net output #0: loss = 1.17465 (* 1 = 1.17465 loss)
I0703 15:30:40.855478 74500 sgd_solver.cpp:105] Iteration 500, lr = 0.001
I0703 15:30:43.819857 74500 solver.cpp:218] Iteration 600 (33.6439 iter/s, 2.97231s/100 iters), loss = 1.16408
I0703 15:30:43.819857 74500 solver.cpp:237] Train net output #0: loss = 1.16408 (* 1 = 1.16408 loss)
I0703 15:30:43.819857 74500 sgd_solver.cpp:105] Iteration 600, lr = 0.001
I0703 15:30:46.768635 74500 solver.cpp:218] Iteration 700 (33.907 iter/s, 2.94924s/100 iters), loss = 1.26574
I0703 15:30:46.768635 74500 solver.cpp:237] Train net output #0: loss = 1.26574 (* 1 = 1.26574 loss)
I0703 15:30:46.768635 74500 sgd_solver.cpp:105] Iteration 700, lr = 0.001
I0703 15:30:49.725415 74500 solver.cpp:218] Iteration 800 (33.9314 iter/s, 2.94712s/100 iters), loss = 1.13423
I0703 15:30:49.725415 74500 solver.cpp:237] Train net output #0: loss = 1.13423 (* 1 = 1.13423 loss)
I0703 15:30:49.725415 74500 sgd_solver.cpp:105] Iteration 800, lr = 0.001
I0703 15:30:52.662395 74500 solver.cpp:218] Iteration 900 (34.0415 iter/s, 2.93759s/100 iters), loss = 1.02869
I0703 15:30:52.662395 74500 solver.cpp:237] Train net output #0: loss = 1.02869 (* 1 = 1.02869 loss)
I0703 15:30:52.662395 74500 sgd_solver.cpp:105] Iteration 900, lr = 0.001
I0703 15:30:55.444353 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:30:55.553567 74500 solver.cpp:330] Iteration 1000, Testing net (#0)
I0703 15:30:56.676911 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:30:56.708115 74500 solver.cpp:397] Test net output #0: accuracy = 0.6366
I0703 15:30:56.708115 74500 solver.cpp:397] Test net output #1: loss = 1.04947 (* 1 = 1.04947 loss)
I0703 15:30:56.739320 74500 solver.cpp:218] Iteration 1000 (24.5331 iter/s, 4.07612s/100 iters), loss = 1.04028
I0703 15:30:56.739320 74500 solver.cpp:237] Train net output #0: loss = 1.04028 (* 1 = 1.04028 loss)
I0703 15:30:56.739320 74500 sgd_solver.cpp:105] Iteration 1000, lr = 0.001
I0703 15:30:59.672495 74500 solver.cpp:218] Iteration 1100 (34.167 iter/s, 2.9268s/100 iters), loss = 0.953439
I0703 15:30:59.672495 74500 solver.cpp:237] Train net output #0: loss = 0.953439 (* 1 = 0.953439 loss)
I0703 15:30:59.672495 74500 sgd_solver.cpp:105] Iteration 1100, lr = 0.001
I0703 15:31:02.605671 74500 solver.cpp:218] Iteration 1200 (34.1274 iter/s, 2.9302s/100 iters), loss = 1.00336
I0703 15:31:02.605671 74500 solver.cpp:237] Train net output #0: loss = 1.00336 (* 1 = 1.00336 loss)
I0703 15:31:02.605671 74500 sgd_solver.cpp:105] Iteration 1200, lr = 0.001
I0703 15:31:05.539847 74500 solver.cpp:218] Iteration 1300 (34.2007 iter/s, 2.92391s/100 iters), loss = 0.85533
I0703 15:31:05.539847 74500 solver.cpp:237] Train net output #0: loss = 0.85533 (* 1 = 0.85533 loss)
I0703 15:31:05.539847 74500 sgd_solver.cpp:105] Iteration 1300, lr = 0.001
I0703 15:31:08.457422 74500 solver.cpp:218] Iteration 1400 (34.2084 iter/s, 2.92326s/100 iters), loss = 0.914876
I0703 15:31:08.457422 74500 solver.cpp:237] Train net output #0: loss = 0.914876 (* 1 = 0.914876 loss)
I0703 15:31:08.457422 74500 sgd_solver.cpp:105] Iteration 1400, lr = 0.001
I0703 15:31:11.265781 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:11.359393 74500 solver.cpp:330] Iteration 1500, Testing net (#0)
I0703 15:31:12.498339 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:12.545145 74500 solver.cpp:397] Test net output #0: accuracy = 0.6767
I0703 15:31:12.545145 74500 solver.cpp:397] Test net output #1: loss = 0.940117 (* 1 = 0.940117 loss)
I0703 15:31:12.576349 74500 solver.cpp:218] Iteration 1500 (24.3348 iter/s, 4.10933s/100 iters), loss = 0.86222
I0703 15:31:12.576349 74500 solver.cpp:237] Train net output #0: loss = 0.86222 (* 1 = 0.86222 loss)
I0703 15:31:12.576349 74500 sgd_solver.cpp:105] Iteration 1500, lr = 0.001
I0703 15:31:15.525127 74500 solver.cpp:218] Iteration 1600 (33.924 iter/s, 2.94776s/100 iters), loss = 0.885249
I0703 15:31:15.525127 74500 solver.cpp:237] Train net output #0: loss = 0.885249 (* 1 = 0.885249 loss)
I0703 15:31:15.525127 74500 sgd_solver.cpp:105] Iteration 1600, lr = 0.001
I0703 15:31:18.442701 74500 solver.cpp:218] Iteration 1700 (34.2638 iter/s, 2.91853s/100 iters), loss = 0.870494
I0703 15:31:18.442701 74500 solver.cpp:237] Train net output #0: loss = 0.870494 (* 1 = 0.870494 loss)
I0703 15:31:18.442701 74500 sgd_solver.cpp:105] Iteration 1700, lr = 0.001
I0703 15:31:21.393479 74500 solver.cpp:218] Iteration 1800 (33.8911 iter/s, 2.95063s/100 iters), loss = 0.781557
I0703 15:31:21.393479 74500 solver.cpp:237] Train net output #0: loss = 0.781557 (* 1 = 0.781557 loss)
I0703 15:31:21.393479 74500 sgd_solver.cpp:105] Iteration 1800, lr = 0.001
I0703 15:31:24.342257 74500 solver.cpp:218] Iteration 1900 (33.9231 iter/s, 2.94784s/100 iters), loss = 0.864439
I0703 15:31:24.342257 74500 solver.cpp:237] Train net output #0: loss = 0.864439 (* 1 = 0.864439 loss)
I0703 15:31:24.342257 74500 sgd_solver.cpp:105] Iteration 1900, lr = 0.001
I0703 15:31:27.135015 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:27.228627 74500 solver.cpp:330] Iteration 2000, Testing net (#0)
I0703 15:31:28.351971 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:28.383175 74500 solver.cpp:397] Test net output #0: accuracy = 0.7004
I0703 15:31:28.383175 74500 solver.cpp:397] Test net output #1: loss = 0.882688 (* 1 = 0.882688 loss)
I0703 15:31:28.414379 74500 solver.cpp:218] Iteration 2000 (24.6233 iter/s, 4.06119s/100 iters), loss = 0.774207
I0703 15:31:28.414379 74500 solver.cpp:237] Train net output #0: loss = 0.774207 (* 1 = 0.774207 loss)
I0703 15:31:28.414379 74500 sgd_solver.cpp:105] Iteration 2000, lr = 0.001
I0703 15:31:31.331954 74500 solver.cpp:218] Iteration 2100 (34.2834 iter/s, 2.91686s/100 iters), loss = 0.876714
I0703 15:31:31.331954 74500 solver.cpp:237] Train net output #0: loss = 0.876714 (* 1 = 0.876714 loss)
I0703 15:31:31.331954 74500 sgd_solver.cpp:105] Iteration 2100, lr = 0.001
I0703 15:31:34.265130 74500 solver.cpp:218] Iteration 2200 (34.1405 iter/s, 2.92907s/100 iters), loss = 0.769951
I0703 15:31:34.265130 74500 solver.cpp:237] Train net output #0: loss = 0.769951 (* 1 = 0.769951 loss)
I0703 15:31:34.265130 74500 sgd_solver.cpp:105] Iteration 2200, lr = 0.001
I0703 15:31:37.200306 74500 solver.cpp:218] Iteration 2300 (34.0337 iter/s, 2.93826s/100 iters), loss = 0.734565
I0703 15:31:37.200306 74500 solver.cpp:237] Train net output #0: loss = 0.734565 (* 1 = 0.734565 loss)
I0703 15:31:37.200306 74500 sgd_solver.cpp:105] Iteration 2300, lr = 0.001
I0703 15:31:40.143292 74500 solver.cpp:218] Iteration 2400 (34.0256 iter/s, 2.93896s/100 iters), loss = 0.838753
I0703 15:31:40.143292 74500 solver.cpp:237] Train net output #0: loss = 0.838753 (* 1 = 0.838753 loss)
I0703 15:31:40.143292 74500 sgd_solver.cpp:105] Iteration 2400, lr = 0.001
I0703 15:31:42.935456 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:43.029068 74500 solver.cpp:330] Iteration 2500, Testing net (#0)
I0703 15:31:44.155616 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:44.202422 74500 solver.cpp:397] Test net output #0: accuracy = 0.7105
I0703 15:31:44.202422 74500 solver.cpp:397] Test net output #1: loss = 0.857614 (* 1 = 0.857614 loss)
I0703 15:31:44.233626 74500 solver.cpp:218] Iteration 2500 (24.4966 iter/s, 4.0822s/100 iters), loss = 0.727542
I0703 15:31:44.233626 74500 solver.cpp:237] Train net output #0: loss = 0.727542 (* 1 = 0.727542 loss)
I0703 15:31:44.233626 74500 sgd_solver.cpp:105] Iteration 2500, lr = 0.001
I0703 15:31:47.220022 74500 solver.cpp:218] Iteration 2600 (33.5608 iter/s, 2.97967s/100 iters), loss = 0.876518
I0703 15:31:47.220022 74500 solver.cpp:237] Train net output #0: loss = 0.876518 (* 1 = 0.876518 loss)
I0703 15:31:47.220022 74500 sgd_solver.cpp:105] Iteration 2600, lr = 0.001
I0703 15:31:50.185408 74500 solver.cpp:218] Iteration 2700 (33.7071 iter/s, 2.96673s/100 iters), loss = 0.695705
I0703 15:31:50.185408 74500 solver.cpp:237] Train net output #0: loss = 0.695705 (* 1 = 0.695705 loss)
I0703 15:31:50.185408 74500 sgd_solver.cpp:105] Iteration 2700, lr = 0.001
I0703 15:31:53.121584 74500 solver.cpp:218] Iteration 2800 (34.1462 iter/s, 2.92858s/100 iters), loss = 0.645049
I0703 15:31:53.121584 74500 solver.cpp:237] Train net output #0: loss = 0.645049 (* 1 = 0.645049 loss)
I0703 15:31:53.121584 74500 sgd_solver.cpp:105] Iteration 2800, lr = 0.001
I0703 15:31:56.058043 74500 solver.cpp:218] Iteration 2900 (33.7472 iter/s, 2.96321s/100 iters), loss = 0.787643
I0703 15:31:56.059043 74500 solver.cpp:237] Train net output #0: loss = 0.787643 (* 1 = 0.787643 loss)
I0703 15:31:56.059043 74500 sgd_solver.cpp:105] Iteration 2900, lr = 0.001
I0703 15:31:58.852406 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:31:58.961619 74500 solver.cpp:330] Iteration 3000, Testing net (#0)
I0703 15:32:00.069361 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:32:00.100565 74500 solver.cpp:397] Test net output #0: accuracy = 0.7166
I0703 15:32:00.100565 74500 solver.cpp:397] Test net output #1: loss = 0.846576 (* 1 = 0.846576 loss)
I0703 15:32:00.131769 74500 solver.cpp:218] Iteration 3000 (24.6666 iter/s, 4.05407s/100 iters), loss = 0.718753
I0703 15:32:00.131769 74500 solver.cpp:237] Train net output #0: loss = 0.718753 (* 1 = 0.718753 loss)
I0703 15:32:00.131769 74500 sgd_solver.cpp:105] Iteration 3000, lr = 0.001
I0703 15:32:03.071969 74500 solver.cpp:218] Iteration 3100 (34.124 iter/s, 2.93049s/100 iters), loss = 0.82852
I0703 15:32:03.071969 74500 solver.cpp:237] Train net output #0: loss = 0.82852 (* 1 = 0.82852 loss)
I0703 15:32:03.071969 74500 sgd_solver.cpp:105] Iteration 3100, lr = 0.001
I0703 15:32:05.989543 74500 solver.cpp:218] Iteration 3200 (34.2588 iter/s, 2.91896s/100 iters), loss = 0.659379
I0703 15:32:05.989543 74500 solver.cpp:237] Train net output #0: loss = 0.659379 (* 1 = 0.659379 loss)
I0703 15:32:05.989543 74500 sgd_solver.cpp:105] Iteration 3200, lr = 0.001
I0703 15:32:08.915119 74500 solver.cpp:218] Iteration 3300 (34.2618 iter/s, 2.9187s/100 iters), loss = 0.602391
I0703 15:32:08.915119 74500 solver.cpp:237] Train net output #0: loss = 0.602391 (* 1 = 0.602391 loss)
I0703 15:32:08.915119 74500 sgd_solver.cpp:105] Iteration 3300, lr = 0.001
I0703 15:32:11.832693 74500 solver.cpp:218] Iteration 3400 (34.3215 iter/s, 2.91363s/100 iters), loss = 0.76891
I0703 15:32:11.832693 74500 solver.cpp:237] Train net output #0: loss = 0.76891 (* 1 = 0.76891 loss)
I0703 15:32:11.832693 74500 sgd_solver.cpp:105] Iteration 3400, lr = 0.001
I0703 15:32:14.625654 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:32:14.734869 74500 solver.cpp:330] Iteration 3500, Testing net (#0)
I0703 15:32:15.875815 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:32:15.922621 74500 solver.cpp:397] Test net output #0: accuracy = 0.7107
I0703 15:32:15.922621 74500 solver.cpp:397] Test net output #1: loss = 0.856881 (* 1 = 0.856881 loss)
I0703 15:32:15.953825 74500 solver.cpp:218] Iteration 3500 (24.301 iter/s, 4.11505s/100 iters), loss = 0.70127
I0703 15:32:15.953825 74500 solver.cpp:237] Train net output #0: loss = 0.70127 (* 1 = 0.70127 loss)
I0703 15:32:15.953825 74500 sgd_solver.cpp:105] Iteration 3500, lr = 0.001
I0703 15:32:18.918205 74500 solver.cpp:218] Iteration 3600 (33.7011 iter/s, 2.96727s/100 iters), loss = 0.781694
I0703 15:32:18.918205 74500 solver.cpp:237] Train net output #0: loss = 0.781694 (* 1 = 0.781694 loss)
I0703 15:32:18.918205 74500 sgd_solver.cpp:105] Iteration 3600, lr = 0.001
I0703 15:32:21.916862 74500 solver.cpp:218] Iteration 3700 (33.3442 iter/s, 2.99902s/100 iters), loss = 0.649548
I0703 15:32:21.916862 74500 solver.cpp:237] Train net output #0: loss = 0.649548 (* 1 = 0.649548 loss)
I0703 15:32:21.916862 74500 sgd_solver.cpp:105] Iteration 3700, lr = 0.001
I0703 15:32:24.894482 74500 solver.cpp:218] Iteration 3800 (33.7213 iter/s, 2.96548s/100 iters), loss = 0.556709
I0703 15:32:24.894482 74500 solver.cpp:237] Train net output #0: loss = 0.556709 (* 1 = 0.556709 loss)
I0703 15:32:24.894482 74500 sgd_solver.cpp:105] Iteration 3800, lr = 0.001
I0703 15:32:27.817062 74500 solver.cpp:218] Iteration 3900 (33.928 iter/s, 2.94742s/100 iters), loss = 0.735268
I0703 15:32:27.818063 74500 solver.cpp:237] Train net output #0: loss = 0.735268 (* 1 = 0.735268 loss)
I0703 15:32:27.818063 74500 sgd_solver.cpp:105] Iteration 3900, lr = 0.001
I0703 15:32:30.664855 69280 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:32:30.736066 74500 solver.cpp:447] Snapshotting to binary proto file examples/cifar10/cifar10_quick_iter_4000.caffemodel
I0703 15:32:30.764071 74500 sgd_solver.cpp:273] Snapshotting solver state to binary proto file examples/cifar10/cifar10_quick_iter_4000.solverstate
I0703 15:32:30.779074 74500 solver.cpp:310] Iteration 4000, loss = 0.655316
I0703 15:32:30.780074 74500 solver.cpp:330] Iteration 4000, Testing net (#0)
I0703 15:32:31.939442 69056 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:32:31.970646 74500 solver.cpp:397] Test net output #0: accuracy = 0.7113
I0703 15:32:31.970646 74500 solver.cpp:397] Test net output #1: loss = 0.85738 (* 1 = 0.85738 loss)
I0703 15:32:31.970646 74500 solver.cpp:315] Optimization Done.
I0703 15:32:31.970646 74500 caffe.cpp:260] Optimization Done.

D:\ws_caffe\caffe>
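The training log above follows a fixed pattern: `solver.cpp:218` lines report the training loss every 100 iterations, and `solver.cpp:397` lines report test accuracy every 500. A small sketch of pulling those numbers out with regular expressions, e.g. for plotting a loss curve (the patterns below match the log format of this run; other Caffe builds may format slightly differently):

```python
import re

# Patterns matching the solver.cpp lines shown in the log above.
TRAIN_RE = re.compile(r"Iteration (\d+) \(.*\), loss = ([\d.]+)")
TEST_RE = re.compile(r"Test net output #0: accuracy = ([\d.]+)")

def parse_caffe_log(lines):
    """Return ([(iteration, train_loss), ...], [test_accuracy, ...])."""
    train, test = [], []
    for line in lines:
        m = TRAIN_RE.search(line)
        if m:
            train.append((int(m.group(1)), float(m.group(2))))
            continue
        m = TEST_RE.search(line)
        if m:
            test.append(float(m.group(1)))
    return train, test

sample = [
    "I0703 15:32:00.100565 74500 solver.cpp:397] Test net output #0: accuracy = 0.7166",
    "I0703 15:32:00.131769 74500 solver.cpp:218] Iteration 3000 (24.6666 iter/s, 4.05407s/100 iters), loss = 0.718753",
]
train, test = parse_caffe_log(sample)
print(train)  # [(3000, 0.718753)]
print(test)   # [0.7166]
```

The `lr = 0.001` lines from `sgd_solver.cpp` carry no parentheses, so they are not caught by `TRAIN_RE`.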



--------------------------------------------------------------------------------------------------------------------------

caffe test -model examples/cifar10/cifar10_quick_train_test.prototxt -weights examples/cifar10/cifar10_quick_iter_5000.caffemodel.h5 -iterations 100

I0703 15:36:58.857549 67740 net.cpp:367] relu2 -> conv2 (in-place)
I0703 15:36:58.857549 67740 net.cpp:122] Setting up relu2
I0703 15:36:58.857549 67740 net.cpp:129] Top shape: 100 32 16 16 (819200)
I0703 15:36:58.857549 67740 net.cpp:137] Memory required for data: 27444400
I0703 15:36:58.857549 67740 layer_factory.cpp:58] Creating layer pool2
I0703 15:36:58.857549 67740 net.cpp:84] Creating Layer pool2
I0703 15:36:58.857549 67740 net.cpp:406] pool2 <- conv2
I0703 15:36:58.857549 67740 net.cpp:380] pool2 -> pool2
I0703 15:36:58.857549 67740 net.cpp:122] Setting up pool2
I0703 15:36:58.857549 67740 net.cpp:129] Top shape: 100 32 8 8 (204800)
I0703 15:36:58.857549 67740 net.cpp:137] Memory required for data: 28263600
I0703 15:36:58.857549 67740 layer_factory.cpp:58] Creating layer conv3
I0703 15:36:58.857549 67740 net.cpp:84] Creating Layer conv3
I0703 15:36:58.857549 67740 net.cpp:406] conv3 <- pool2
I0703 15:36:58.857549 67740 net.cpp:380] conv3 -> conv3
I0703 15:36:58.873150 67740 net.cpp:122] Setting up conv3
I0703 15:36:58.873150 67740 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:36:58.873150 67740 net.cpp:137] Memory required for data: 29902000
I0703 15:36:58.873150 67740 layer_factory.cpp:58] Creating layer relu3
I0703 15:36:58.873150 67740 net.cpp:84] Creating Layer relu3
I0703 15:36:58.873150 67740 net.cpp:406] relu3 <- conv3
I0703 15:36:58.873150 67740 net.cpp:367] relu3 -> conv3 (in-place)
I0703 15:36:58.873150 67740 net.cpp:122] Setting up relu3
I0703 15:36:58.873150 67740 net.cpp:129] Top shape: 100 64 8 8 (409600)
I0703 15:36:58.873150 67740 net.cpp:137] Memory required for data: 31540400
I0703 15:36:58.873150 67740 layer_factory.cpp:58] Creating layer pool3
I0703 15:36:58.873150 67740 net.cpp:84] Creating Layer pool3
I0703 15:36:58.873150 67740 net.cpp:406] pool3 <- conv3
I0703 15:36:58.873150 67740 net.cpp:380] pool3 -> pool3
I0703 15:36:58.873150 67740 net.cpp:122] Setting up pool3
I0703 15:36:58.873150 67740 net.cpp:129] Top shape: 100 64 4 4 (102400)
I0703 15:36:58.873150 67740 net.cpp:137] Memory required for data: 31950000
I0703 15:36:58.873150 67740 layer_factory.cpp:58] Creating layer ip1
I0703 15:36:58.873150 67740 net.cpp:84] Creating Layer ip1
I0703 15:36:58.873150 67740 net.cpp:406] ip1 <- pool3
I0703 15:36:58.873150 67740 net.cpp:380] ip1 -> ip1
I0703 15:36:58.873150 67740 net.cpp:122] Setting up ip1
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 64 (6400)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31975600
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer ip2
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer ip2
I0703 15:36:58.888752 67740 net.cpp:406] ip2 <- ip1
I0703 15:36:58.888752 67740 net.cpp:380] ip2 -> ip2
I0703 15:36:58.888752 67740 net.cpp:122] Setting up ip2
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31979600
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer ip2_ip2_0_split
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer ip2_ip2_0_split
I0703 15:36:58.888752 67740 net.cpp:406] ip2_ip2_0_split <- ip2
I0703 15:36:58.888752 67740 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_0
I0703 15:36:58.888752 67740 net.cpp:380] ip2_ip2_0_split -> ip2_ip2_0_split_1
I0703 15:36:58.888752 67740 net.cpp:122] Setting up ip2_ip2_0_split
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: 100 10 (1000)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31987600
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer accuracy
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer accuracy
I0703 15:36:58.888752 67740 net.cpp:406] accuracy <- ip2_ip2_0_split_0
I0703 15:36:58.888752 67740 net.cpp:406] accuracy <- label_cifar_1_split_0
I0703 15:36:58.888752 67740 net.cpp:380] accuracy -> accuracy
I0703 15:36:58.888752 67740 net.cpp:122] Setting up accuracy
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: (1)
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31987604
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer loss
I0703 15:36:58.888752 67740 net.cpp:84] Creating Layer loss
I0703 15:36:58.888752 67740 net.cpp:406] loss <- ip2_ip2_0_split_1
I0703 15:36:58.888752 67740 net.cpp:406] loss <- label_cifar_1_split_1
I0703 15:36:58.888752 67740 net.cpp:380] loss -> loss
I0703 15:36:58.888752 67740 layer_factory.cpp:58] Creating layer loss
I0703 15:36:58.888752 67740 net.cpp:122] Setting up loss
I0703 15:36:58.888752 67740 net.cpp:129] Top shape: (1)
I0703 15:36:58.888752 67740 net.cpp:132] with loss weight 1
I0703 15:36:58.888752 67740 net.cpp:137] Memory required for data: 31987608
I0703 15:36:58.888752 67740 net.cpp:198] loss needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:200] accuracy does not need backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] ip2_ip2_0_split needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] ip2 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] ip1 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] pool3 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] relu3 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] conv3 needs backward computation.
I0703 15:36:58.888752 67740 net.cpp:198] pool2 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] relu2 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] conv2 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] relu1 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] pool1 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:198] conv1 needs backward computation.
I0703 15:36:58.904353 67740 net.cpp:200] label_cifar_1_split does not need backward computation.
I0703 15:36:58.904353 67740 net.cpp:200] cifar does not need backward computation.
I0703 15:36:58.904353 67740 net.cpp:242] This network produces output accuracy
I0703 15:36:58.904353 67740 net.cpp:242] This network produces output loss
I0703 15:36:58.904353 67740 net.cpp:255] Network initialization done.
I0703 15:36:58.951158 67740 hdf5.cpp:32] Datatype class: H5T_FLOAT
I0703 15:36:58.951158 67740 caffe.cpp:291] Running for 100 iterations.
I0703 15:36:59.341192 67740 caffe.cpp:314] Batch 0, accuracy = 0.79
I0703 15:36:59.341192 67740 caffe.cpp:314] Batch 0, loss = 0.650074
I0703 15:36:59.731227 67740 caffe.cpp:314] Batch 1, accuracy = 0.74
I0703 15:36:59.731227 67740 caffe.cpp:314] Batch 1, loss = 0.732871
I0703 15:37:00.105661 67740 caffe.cpp:314] Batch 2, accuracy = 0.75
I0703 15:37:00.105661 67740 caffe.cpp:314] Batch 2, loss = 0.75108
I0703 15:37:00.589304 67740 caffe.cpp:314] Batch 3, accuracy = 0.74
I0703 15:37:00.589304 67740 caffe.cpp:314] Batch 3, loss = 0.827869
I0703 15:37:00.994941 67740 caffe.cpp:314] Batch 4, accuracy = 0.71
I0703 15:37:00.994941 67740 caffe.cpp:314] Batch 4, loss = 0.783882
I0703 15:37:01.306969 67740 caffe.cpp:314] Batch 5, accuracy = 0.8
I0703 15:37:01.306969 67740 caffe.cpp:314] Batch 5, loss = 0.478434
I0703 15:37:01.618998 67740 caffe.cpp:314] Batch 6, accuracy = 0.78
I0703 15:37:01.618998 67740 caffe.cpp:314] Batch 6, loss = 0.629121
I0703 15:37:02.305459 67740 caffe.cpp:314] Batch 7, accuracy = 0.73
I0703 15:37:02.305459 67740 caffe.cpp:314] Batch 7, loss = 0.900917
I0703 15:37:02.617486 67740 caffe.cpp:314] Batch 8, accuracy = 0.75
I0703 15:37:02.617486 67740 caffe.cpp:314] Batch 8, loss = 0.756295
I0703 15:37:02.992920 67740 caffe.cpp:314] Batch 9, accuracy = 0.79
I0703 15:37:02.992920 67740 caffe.cpp:314] Batch 9, loss = 0.72793
I0703 15:37:03.304949 67740 caffe.cpp:314] Batch 10, accuracy = 0.83
I0703 15:37:03.304949 67740 caffe.cpp:314] Batch 10, loss = 0.678172
I0703 15:37:04.005412 67740 caffe.cpp:314] Batch 11, accuracy = 0.75
I0703 15:37:04.005412 67740 caffe.cpp:314] Batch 11, loss = 0.737166
I0703 15:37:04.333041 67740 caffe.cpp:314] Batch 12, accuracy = 0.76
I0703 15:37:04.333041 67740 caffe.cpp:314] Batch 12, loss = 0.590975
I0703 15:37:04.661671 67740 caffe.cpp:314] Batch 13, accuracy = 0.77
I0703 15:37:04.661671 67740 caffe.cpp:314] Batch 13, loss = 0.638049
I0703 15:37:05.136315 67740 caffe.cpp:314] Batch 14, accuracy = 0.77
I0703 15:37:05.136315 67740 caffe.cpp:314] Batch 14, loss = 0.64608
I0703 15:37:05.741775 67740 caffe.cpp:314] Batch 15, accuracy = 0.75
I0703 15:37:05.741775 67740 caffe.cpp:314] Batch 15, loss = 0.745028
I0703 15:37:06.101608 67740 caffe.cpp:314] Batch 16, accuracy = 0.78
I0703 15:37:06.101608 67740 caffe.cpp:314] Batch 16, loss = 0.783791
I0703 15:37:06.430238 67740 caffe.cpp:314] Batch 17, accuracy = 0.77
I0703 15:37:06.430238 67740 caffe.cpp:314] Batch 17, loss = 0.694237
I0703 15:37:06.835873 67740 caffe.cpp:314] Batch 18, accuracy = 0.76
I0703 15:37:06.835873 67740 caffe.cpp:314] Batch 18, loss = 0.8077
I0703 15:37:07.335119 67740 caffe.cpp:314] Batch 19, accuracy = 0.74
I0703 15:37:07.335119 67740 caffe.cpp:314] Batch 19, loss = 0.816296
I0703 15:37:07.647146 67740 caffe.cpp:314] Batch 20, accuracy = 0.76
I0703 15:37:07.647146 67740 caffe.cpp:314] Batch 20, loss = 0.778361
I0703 15:37:07.959174 67740 caffe.cpp:314] Batch 21, accuracy = 0.74
I0703 15:37:07.959174 67740 caffe.cpp:314] Batch 21, loss = 0.733748
I0703 15:37:08.396013 67740 caffe.cpp:314] Batch 22, accuracy = 0.78
I0703 15:37:08.396013 67740 caffe.cpp:314] Batch 22, loss = 0.767683
I0703 15:37:08.942062 67740 caffe.cpp:314] Batch 23, accuracy = 0.72
I0703 15:37:08.942062 67740 caffe.cpp:314] Batch 23, loss = 0.860217
I0703 15:37:09.254091 67740 caffe.cpp:314] Batch 24, accuracy = 0.75
I0703 15:37:09.254091 67740 caffe.cpp:314] Batch 24, loss = 0.902482
I0703 15:37:09.566118 67740 caffe.cpp:314] Batch 25, accuracy = 0.68
I0703 15:37:09.566118 67740 caffe.cpp:314] Batch 25, loss = 1.06045
I0703 15:37:10.065363 67740 caffe.cpp:314] Batch 26, accuracy = 0.81
I0703 15:37:10.065363 67740 caffe.cpp:314] Batch 26, loss = 0.599207
I0703 15:37:10.580209 67740 caffe.cpp:314] Batch 27, accuracy = 0.76
I0703 15:37:10.580209 67740 caffe.cpp:314] Batch 27, loss = 0.750922
I0703 15:37:10.939041 67740 caffe.cpp:314] Batch 28, accuracy = 0.78
I0703 15:37:10.939041 67740 caffe.cpp:314] Batch 28, loss = 0.669347
I0703 15:37:11.251070 67740 caffe.cpp:314] Batch 29, accuracy = 0.8
I0703 15:37:11.251070 67740 caffe.cpp:314] Batch 29, loss = 0.742157
I0703 15:37:11.563097 67740 caffe.cpp:314] Batch 30, accuracy = 0.74
I0703 15:37:11.563097 67740 caffe.cpp:314] Batch 30, loss = 0.699957
I0703 15:37:12.202755 67740 caffe.cpp:314] Batch 31, accuracy = 0.74
I0703 15:37:12.202755 67740 caffe.cpp:314] Batch 31, loss = 0.771317
I0703 15:37:12.514783 67740 caffe.cpp:314] Batch 32, accuracy = 0.77
I0703 15:37:12.514783 67740 caffe.cpp:314] Batch 32, loss = 0.717153
I0703 15:37:12.858013 67740 caffe.cpp:314] Batch 33, accuracy = 0.78
I0703 15:37:12.858013 67740 caffe.cpp:314] Batch 33, loss = 0.695417
I0703 15:37:13.185643 67740 caffe.cpp:314] Batch 34, accuracy = 0.66
I0703 15:37:13.185643 67740 caffe.cpp:314] Batch 34, loss = 1.03245
I0703 15:37:13.794097 67740 caffe.cpp:314] Batch 35, accuracy = 0.78
I0703 15:37:13.794097 67740 caffe.cpp:314] Batch 35, loss = 0.822289
I0703 15:37:14.184132 67740 caffe.cpp:314] Batch 36, accuracy = 0.76
I0703 15:37:14.184132 67740 caffe.cpp:314] Batch 36, loss = 0.775723
I0703 15:37:14.480559 67740 caffe.cpp:314] Batch 37, accuracy = 0.75
I0703 15:37:14.480559 67740 caffe.cpp:314] Batch 37, loss = 0.880586
I0703 15:37:14.776986 67740 caffe.cpp:314] Batch 38, accuracy = 0.74
I0703 15:37:14.776986 67740 caffe.cpp:314] Batch 38, loss = 0.686667
I0703 15:37:15.463448 67740 caffe.cpp:314] Batch 39, accuracy = 0.73
I0703 15:37:15.463448 67740 caffe.cpp:314] Batch 39, loss = 0.649401
I0703 15:37:15.750674 67740 caffe.cpp:314] Batch 40, accuracy = 0.78
I0703 15:37:15.750674 67740 caffe.cpp:314] Batch 40, loss = 0.715338
I0703 15:37:16.156311 67740 caffe.cpp:314] Batch 41, accuracy = 0.79
I0703 15:37:16.156311 67740 caffe.cpp:314] Batch 41, loss = 0.789637
I0703 15:37:16.483939 67740 caffe.cpp:314] Batch 42, accuracy = 0.83
I0703 15:37:16.483939 67740 caffe.cpp:314] Batch 42, loss = 0.512648
I0703 15:37:17.171201 67740 caffe.cpp:314] Batch 43, accuracy = 0.79
I0703 15:37:17.171201 67740 caffe.cpp:314] Batch 43, loss = 0.754063
I0703 15:37:17.527634 67740 caffe.cpp:314] Batch 44, accuracy = 0.84
I0703 15:37:17.527634 67740 caffe.cpp:314] Batch 44, loss = 0.630253
I0703 15:37:17.839663 67740 caffe.cpp:314] Batch 45, accuracy = 0.79
I0703 15:37:17.839663 67740 caffe.cpp:314] Batch 45, loss = 0.799423
I0703 15:37:18.277501 67740 caffe.cpp:314] Batch 46, accuracy = 0.78
I0703 15:37:18.277501 67740 caffe.cpp:314] Batch 46, loss = 0.653652
I0703 15:37:18.745543 67740 caffe.cpp:314] Batch 47, accuracy = 0.77
I0703 15:37:18.745543 67740 caffe.cpp:314] Batch 47, loss = 0.707297
I0703 15:37:19.073173 67740 caffe.cpp:314] Batch 48, accuracy = 0.81
I0703 15:37:19.073173 67740 caffe.cpp:314] Batch 48, loss = 0.565867
I0703 15:37:19.385200 67740 caffe.cpp:314] Batch 49, accuracy = 0.77
I0703 15:37:19.385200 67740 caffe.cpp:314] Batch 49, loss = 0.670153
I0703 15:37:19.697228 67740 caffe.cpp:314] Batch 50, accuracy = 0.77
I0703 15:37:19.697228 67740 caffe.cpp:314] Batch 50, loss = 0.594869
I0703 15:37:20.446095 67740 caffe.cpp:314] Batch 51, accuracy = 0.76
I0703 15:37:20.446095 67740 caffe.cpp:314] Batch 51, loss = 0.700686
I0703 15:37:20.758123 67740 caffe.cpp:314] Batch 52, accuracy = 0.75
I0703 15:37:20.758123 67740 caffe.cpp:314] Batch 52, loss = 0.621944
I0703 15:37:21.101354 67740 caffe.cpp:314] Batch 53, accuracy = 0.72
I0703 15:37:21.101354 67740 caffe.cpp:314] Batch 53, loss = 0.814192
I0703 15:37:21.444586 67740 caffe.cpp:314] Batch 54, accuracy = 0.77
I0703 15:37:21.444586 67740 caffe.cpp:314] Batch 54, loss = 0.692094
I0703 15:37:22.084242 67740 caffe.cpp:314] Batch 55, accuracy = 0.71
I0703 15:37:22.084242 67740 caffe.cpp:314] Batch 55, loss = 0.841092
I0703 15:37:22.396270 67740 caffe.cpp:314] Batch 56, accuracy = 0.76
I0703 15:37:22.396270 67740 caffe.cpp:314] Batch 56, loss = 0.880208
I0703 15:37:22.708298 67740 caffe.cpp:314] Batch 57, accuracy = 0.82
I0703 15:37:22.708298 67740 caffe.cpp:314] Batch 57, loss = 0.565382
I0703 15:37:23.051529 67740 caffe.cpp:314] Batch 58, accuracy = 0.72
I0703 15:37:23.051529 67740 caffe.cpp:314] Batch 58, loss = 0.812194
I0703 15:37:23.691187 67740 caffe.cpp:314] Batch 59, accuracy = 0.76
I0703 15:37:23.691187 67740 caffe.cpp:314] Batch 59, loss = 0.812453
I0703 15:37:24.034417 67740 caffe.cpp:314] Batch 60, accuracy = 0.84
I0703 15:37:24.034417 67740 caffe.cpp:314] Batch 60, loss = 0.606914
I0703 15:37:24.362047 67740 caffe.cpp:314] Batch 61, accuracy = 0.73
I0703 15:37:24.362047 67740 caffe.cpp:314] Batch 61, loss = 0.735381
I0703 15:37:24.705278 67740 caffe.cpp:314] Batch 62, accuracy = 0.77
I0703 15:37:24.705278 67740 caffe.cpp:314] Batch 62, loss = 0.642518
I0703 15:37:25.391739 67740 caffe.cpp:314] Batch 63, accuracy = 0.81
I0703 15:37:25.391739 67740 caffe.cpp:314] Batch 63, loss = 0.617093
I0703 15:37:25.719369 67740 caffe.cpp:314] Batch 64, accuracy = 0.74
I0703 15:37:25.719369 67740 caffe.cpp:314] Batch 64, loss = 0.747036
I0703 15:37:26.109405 67740 caffe.cpp:314] Batch 65, accuracy = 0.76
I0703 15:37:26.109405 67740 caffe.cpp:314] Batch 65, loss = 0.837057
I0703 15:37:26.639852 67740 caffe.cpp:314] Batch 66, accuracy = 0.76
I0703 15:37:26.639852 67740 caffe.cpp:314] Batch 66, loss = 0.689606
I0703 15:37:27.139096 67740 caffe.cpp:314] Batch 67, accuracy = 0.77
I0703 15:37:27.139096 67740 caffe.cpp:314] Batch 67, loss = 0.770931
I0703 15:37:27.451124 67740 caffe.cpp:314] Batch 68, accuracy = 0.75
I0703 15:37:27.451124 67740 caffe.cpp:314] Batch 68, loss = 0.715115
I0703 15:37:27.778753 67740 caffe.cpp:314] Batch 69, accuracy = 0.7
I0703 15:37:27.778753 67740 caffe.cpp:314] Batch 69, loss = 0.947363
I0703 15:37:28.215593 67740 caffe.cpp:314] Batch 70, accuracy = 0.79
I0703 15:37:28.215593 67740 caffe.cpp:314] Batch 70, loss = 0.720477
I0703 15:37:28.699236 67740 caffe.cpp:314] Batch 71, accuracy = 0.81
I0703 15:37:28.699236 67740 caffe.cpp:314] Batch 71, loss = 0.644322
I0703 15:37:29.058068 67740 caffe.cpp:314] Batch 72, accuracy = 0.78
I0703 15:37:29.058068 67740 caffe.cpp:314] Batch 72, loss = 0.576543
I0703 15:37:29.401299 67740 caffe.cpp:314] Batch 73, accuracy = 0.82
I0703 15:37:29.416900 67740 caffe.cpp:314] Batch 73, loss = 0.483146
I0703 15:37:30.243775 67740 caffe.cpp:314] Batch 74, accuracy = 0.77
I0703 15:37:30.243775 67740 caffe.cpp:314] Batch 74, loss = 0.795883
I0703 15:37:30.571404 67740 caffe.cpp:314] Batch 75, accuracy = 0.78
I0703 15:37:30.571404 67740 caffe.cpp:314] Batch 75, loss = 0.626209
I0703 15:37:30.899034 67740 caffe.cpp:314] Batch 76, accuracy = 0.74
I0703 15:37:30.899034 67740 caffe.cpp:314] Batch 76, loss = 0.791658
I0703 15:37:31.242264 67740 caffe.cpp:314] Batch 77, accuracy = 0.78
I0703 15:37:31.242264 67740 caffe.cpp:314] Batch 77, loss = 0.690147
I0703 15:37:31.850719 67740 caffe.cpp:314] Batch 78, accuracy = 0.79
I0703 15:37:31.850719 67740 caffe.cpp:314] Batch 78, loss = 0.64814
I0703 15:37:32.178349 67740 caffe.cpp:314] Batch 79, accuracy = 0.74
I0703 15:37:32.178349 67740 caffe.cpp:314] Batch 79, loss = 0.790071
I0703 15:37:32.505978 67740 caffe.cpp:314] Batch 80, accuracy = 0.73
I0703 15:37:32.505978 67740 caffe.cpp:314] Batch 80, loss = 0.66705
I0703 15:37:32.833607 67740 caffe.cpp:314] Batch 81, accuracy = 0.78
I0703 15:37:32.833607 67740 caffe.cpp:314] Batch 81, loss = 0.601753
I0703 15:37:33.535670 67740 caffe.cpp:314] Batch 82, accuracy = 0.79
I0703 15:37:33.535670 67740 caffe.cpp:314] Batch 82, loss = 0.580124
I0703 15:37:33.847698 67740 caffe.cpp:314] Batch 83, accuracy = 0.74
I0703 15:37:33.847698 67740 caffe.cpp:314] Batch 83, loss = 0.791201
I0703 15:37:34.206531 67740 caffe.cpp:314] Batch 84, accuracy = 0.73
I0703 15:37:34.206531 67740 caffe.cpp:314] Batch 84, loss = 0.912428
I0703 15:37:34.721376 67740 caffe.cpp:314] Batch 85, accuracy = 0.82
I0703 15:37:34.721376 67740 caffe.cpp:314] Batch 85, loss = 0.584871
I0703 15:37:35.251824 67740 caffe.cpp:314] Batch 86, accuracy = 0.77
I0703 15:37:35.251824 67740 caffe.cpp:314] Batch 86, loss = 0.65482
I0703 15:37:35.565852 67740 caffe.cpp:314] Batch 87, accuracy = 0.8
I0703 15:37:35.565852 67740 caffe.cpp:314] Batch 87, loss = 0.684847
I0703 15:37:35.893482 67740 caffe.cpp:314] Batch 88, accuracy = 0.74
I0703 15:37:35.893482 67740 caffe.cpp:314] Batch 88, loss = 0.70748
I0703 15:37:36.706955 67740 caffe.cpp:314] Batch 89, accuracy = 0.78
I0703 15:37:36.706955 67740 caffe.cpp:314] Batch 89, loss = 0.703976
I0703 15:37:37.050186 67740 caffe.cpp:314] Batch 90, accuracy = 0.72
I0703 15:37:37.050186 67740 caffe.cpp:314] Batch 90, loss = 0.78584
I0703 15:37:37.362215 67740 caffe.cpp:314] Batch 91, accuracy = 0.82
I0703 15:37:37.362215 67740 caffe.cpp:314] Batch 91, loss = 0.571601
I0703 15:37:37.736649 67740 caffe.cpp:314] Batch 92, accuracy = 0.71
I0703 15:37:37.736649 67740 caffe.cpp:314] Batch 92, loss = 0.875672
I0703 15:37:38.532320 67740 caffe.cpp:314] Batch 93, accuracy = 0.77
I0703 15:37:38.532320 67740 caffe.cpp:314] Batch 93, loss = 0.780201
I0703 15:37:38.922354 67740 caffe.cpp:314] Batch 94, accuracy = 0.74
I0703 15:37:38.922354 67740 caffe.cpp:314] Batch 94, loss = 0.690553
I0703 15:37:39.546411 67740 caffe.cpp:314] Batch 95, accuracy = 0.82
I0703 15:37:39.546411 67740 caffe.cpp:314] Batch 95, loss = 0.664873
I0703 15:37:39.562011 69448 data_layer.cpp:73] Restarting data prefetching from start.
I0703 15:37:40.061256 67740 caffe.cpp:314] Batch 96, accuracy = 0.75
I0703 15:37:40.061256 67740 caffe.cpp:314] Batch 96, loss = 0.770992
I0703 15:37:40.404487 67740 caffe.cpp:314] Batch 97, accuracy = 0.71
I0703 15:37:40.404487 67740 caffe.cpp:314] Batch 97, loss = 0.847697
I0703 15:37:40.747719 67740 caffe.cpp:314] Batch 98, accuracy = 0.75
I0703 15:37:40.747719 67740 caffe.cpp:314] Batch 98, loss = 0.72742
I0703 15:37:41.090950 67740 caffe.cpp:314] Batch 99, accuracy = 0.75
I0703 15:37:41.090950 67740 caffe.cpp:314] Batch 99, loss = 0.656324
I0703 15:37:41.090950 67740 caffe.cpp:319] Loss: 0.724403
I0703 15:37:41.090950 67740 caffe.cpp:331] accuracy = 0.7643
I0703 15:37:41.090950 67740 caffe.cpp:331] loss = 0.724403 (* 1 = 0.724403 loss)

D:\ws_caffe\caffe>
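The final `accuracy = 0.7643` that `caffe test` prints is simply the arithmetic mean of the 100 per-batch accuracies listed above (100 batches of 100 images each cover the full 10,000-image CIFAR-10 test set). A quick sketch using the first five batch values from this log:

```python
# Accuracies of Batches 0-4 from the test log above.
batch_acc = [0.79, 0.74, 0.75, 0.74, 0.71]

# caffe test averages the per-batch accuracies to get its final figure.
mean_acc = sum(batch_acc) / len(batch_acc)
print(round(mean_acc, 4))  # 0.746
```

Averaged over all 100 batches, the same computation yields the 0.7643 reported at the end of the run.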


---------------------------------------------------------------------------------------------------------

This post drew on the following articles: http://www.cnblogs.com/tiansha/p/6458366.html

http://blog.csdn.net/zb1165048017/article/details/51476516

My thanks to their authors.

posted @ 2017-07-03 14:46  leoking01