LSTM

Training log excerpt from an LSTM classifier. Each "Training iter #N" line reports loss and accuracy on the current training mini-batch; each "PERFORMANCE ON TEST SET" line reports loss and accuracy on the full held-out test set. Over this stretch, training-batch accuracy sits around 0.96 to 0.97 while test accuracy hovers around 0.91 to 0.92, with test loss near 0.63.

PERFORMANCE ON TEST SET: Batch Loss = 0.6423985362052917, Accuracy = 0.9051185846328735
Training iter #584292:   Batch Loss = 0.357018, Accuracy = 0.9660000205039978
PERFORMANCE ON TEST SET: Batch Loss = 0.6445194482803345, Accuracy = 0.9026217460632324
Training iter #584296:   Batch Loss = 0.371959, Accuracy = 0.9516000151634216
PERFORMANCE ON TEST SET: Batch Loss = 0.6355495452880859, Accuracy = 0.9136080145835876
Training iter #584300:   Batch Loss = 0.379772, Accuracy = 0.9495999813079834
PERFORMANCE ON TEST SET: Batch Loss = 0.6288002133369446, Accuracy = 0.9158551692962646
Training iter #584304:   Batch Loss = 0.364809, Accuracy = 0.9643999934196472
PERFORMANCE ON TEST SET: Batch Loss = 0.630466878414154, Accuracy = 0.9111111164093018
Training iter #584308:   Batch Loss = 0.362532, Accuracy = 0.9664000272750854
PERFORMANCE ON TEST SET: Batch Loss = 0.6333655714988708, Accuracy = 0.9141073822975159
Training iter #584312:   Batch Loss = 0.367023, Accuracy = 0.9607999920845032
PERFORMANCE ON TEST SET: Batch Loss = 0.6294339895248413, Accuracy = 0.9158551692962646
Training iter #584316:   Batch Loss = 0.358729, Accuracy = 0.9696000218391418
PERFORMANCE ON TEST SET: Batch Loss = 0.6266082525253296, Accuracy = 0.9186017513275146
Training iter #584320:   Batch Loss = 0.364320, Accuracy = 0.9652000069618225
PERFORMANCE ON TEST SET: Batch Loss = 0.6318942904472351, Accuracy = 0.9181023836135864
Training iter #584324:   Batch Loss = 0.362601, Accuracy = 0.9643999934196472
PERFORMANCE ON TEST SET: Batch Loss = 0.6292382478713989, Accuracy = 0.9161048531532288
Training iter #584328:   Batch Loss = 0.354073, Accuracy = 0.967199981212616
PERFORMANCE ON TEST SET: Batch Loss = 0.6256863474845886, Accuracy = 0.9181023836135864
Training iter #584332:   Batch Loss = 0.358450, Accuracy = 0.9692000150680542
PERFORMANCE ON TEST SET: Batch Loss = 0.6302834153175354, Accuracy = 0.9198501706123352
Training iter #584336:   Batch Loss = 0.352892, Accuracy = 0.9652000069618225
PERFORMANCE ON TEST SET: Batch Loss = 0.6302400827407837, Accuracy = 0.9161048531532288
Training iter #584340:   Batch Loss = 0.352440, Accuracy = 0.9700000286102295
PERFORMANCE ON TEST SET: Batch Loss = 0.6251929998397827, Accuracy = 0.9186017513275146
Training iter #584344:   Batch Loss = 0.351044, Accuracy = 0.9667999744415283
PERFORMANCE ON TEST SET: Batch Loss = 0.6341428756713867, Accuracy = 0.9141073822975159
Training iter #584348:   Batch Loss = 0.351749, Accuracy = 0.9652000069618225
PERFORMANCE ON TEST SET: Batch Loss = 0.6319619417190552, Accuracy = 0.9173533320426941
Training iter #584352:   Batch Loss = 0.349911, Accuracy = 0.9679999947547913
PERFORMANCE ON TEST SET: Batch Loss = 0.6262838840484619, Accuracy = 0.9183520674705505
Training iter #584356:   Batch Loss = 0.351974, Accuracy = 0.9715999960899353
PERFORMANCE ON TEST SET: Batch Loss = 0.6314669847488403, Accuracy = 0.9158551692962646
Training iter #584360:   Batch Loss = 0.357018, Accuracy = 0.9656000137329102
PERFORMANCE ON TEST SET: Batch Loss = 0.6328622102737427, Accuracy = 0.9163545370101929
Training iter #584364:   Batch Loss = 0.360940, Accuracy = 0.9643999934196472
PERFORMANCE ON TEST SET: Batch Loss = 0.6280090808868408, Accuracy = 0.91710364818573
Training iter #584368:   Batch Loss = 0.351369, Accuracy = 0.9692000150680542
PERFORMANCE ON TEST SET: Batch Loss = 0.6339057683944702, Accuracy = 0.91435706615448
Training iter #584372:   Batch Loss = 0.354622, Accuracy = 0.9652000069618225
PERFORMANCE ON TEST SET: Batch Loss = 0.6330746412277222, Accuracy = 0.9151061177253723
Training iter #584376:   Batch Loss = 0.367034, Accuracy = 0.9595999717712402
PERFORMANCE ON TEST SET: Batch Loss = 0.6285595893859863, Accuracy = 0.9183520674705505
Training iter #584380:   Batch Loss = 0.348664, Accuracy = 0.9728000164031982
PERFORMANCE ON TEST SET: Batch Loss = 0.6320279836654663, Accuracy = 0.916604220867157
Training iter #584384:   Batch Loss = 0.358708, Accuracy = 0.9624000191688538
PERFORMANCE ON TEST SET: Batch Loss = 0.6281108856201172, Accuracy = 0.9208489656448364
Training iter #584388:   Batch Loss = 0.358550, Accuracy = 0.9643999934196472
PERFORMANCE ON TEST SET: Batch Loss = 0.6256601214408875, Accuracy = 0.9186017513275146
Training iter #584392:   Batch Loss = 0.347133, Accuracy = 0.9724000096321106
PERFORMANCE ON TEST SET: Batch Loss = 0.6251524090766907, Accuracy = 0.919350802898407
Training iter #584396:   Batch Loss = 0.354937, Accuracy = 0.9660000205039978
PERFORMANCE ON TEST SET: Batch Loss = 0.6278438568115234, Accuracy = 0.9200998544692993
Training iter #584400:   Batch Loss = 0.360063, Accuracy = 0.9656000137329102
PERFORMANCE ON TEST SET: Batch Loss = 0.6272940635681152, Accuracy = 0.9205992221832275
Training iter #584404:   Batch Loss = 0.350955, Accuracy = 0.9692000150680542
PERFORMANCE ON TEST SET: Batch Loss = 0.6247848272323608, Accuracy = 0.9176030158996582
Training iter #584408:   Batch Loss = 0.357840, Accuracy = 0.967199981212616
PERFORMANCE ON TEST SET: Batch Loss = 0.6298718452453613, Accuracy = 0.9218477010726929
Training iter #584412:   Batch Loss = 0.364055, Accuracy = 0.9643999934196472
PERFORMANCE ON TEST SET: Batch Loss = 0.6254392862319946, Accuracy = 0.9210986495018005
Training iter #584416:   Batch Loss = 0.351912, Accuracy = 0.9711999893188477
PERFORMANCE ON TEST SET: Batch Loss = 0.6274770498275757, Accuracy = 0.9200998544692993
Training iter #584420:   Batch Loss = 0.358040, Accuracy = 0.9696000218391418
PERFORMANCE ON TEST SET: Batch Loss = 0.6289907097816467, Accuracy = 0.922097384929657
Training iter #584424:   Batch Loss = 0.361915, Accuracy = 0.9635999798774719
PERFORMANCE ON TEST SET: Batch Loss = 0.6267648339271545, Accuracy = 0.9188514351844788
Training iter #584428:   Batch Loss = 0.351718, Accuracy = 0.9700000286102295
PERFORMANCE ON TEST SET: Batch Loss = 0.6258651614189148, Accuracy = 0.9203495383262634
Training iter #584432:   Batch Loss = 0.356733, Accuracy = 0.967199981212616
PERFORMANCE ON TEST SET: Batch Loss = 0.6291944980621338, Accuracy = 0.9200998544692993
Training iter #584436:   Batch Loss = 0.355049, Accuracy = 0.9660000205039978
PERFORMANCE ON TEST SET: Batch Loss = 0.6266140341758728, Accuracy = 0.9196004867553711
Training iter #584440:   Batch Loss = 0.344642, Accuracy = 0.9739999771118164
PERFORMANCE ON TEST SET: Batch Loss = 0.6238135099411011, Accuracy = 0.9210986495018005
Training iter #584444:   Batch Loss = 0.346299, Accuracy = 0.9724000096321106
PERFORMANCE ON TEST SET: Batch Loss = 0.6301990747451782, Accuracy = 0.9188514351844788
Training iter #584448:   Batch Loss = 0.349577, Accuracy = 0.9667999744415283
PERFORMANCE ON TEST SET: Batch Loss = 0.6253681182861328, Accuracy = 0.919350802898407
Training iter #584452:   Batch Loss = 0.349254, Accuracy = 0.9711999893188477
PERFORMANCE ON TEST SET: Batch Loss = 0.6253665089607239, Accuracy = 0.9205992221832275
Training iter #584456:   Batch Loss = 0.342836, Accuracy = 0.9739999771118164
PERFORMANCE ON TEST SET: Batch Loss = 0.6288788318634033, Accuracy = 0.9203495383262634
Training iter #584460:   Batch Loss = 0.351835, Accuracy = 0.9656000137329102
PERFORMANCE ON TEST SET: Batch Loss = 0.6269345283508301, Accuracy = 0.9213483333587646
Training iter #584464:   Batch Loss = 0.350256, Accuracy = 0.9700000286102295
PERFORMANCE ON TEST SET: Batch Loss = 0.6251749992370605, Accuracy = 0.9178526997566223
Training iter #584468:   Batch Loss = 0.346889, Accuracy = 0.9724000096321106
PERFORMANCE ON TEST SET: Batch Loss = 0.6286365985870361, Accuracy = 0.919350802898407
Training iter #584472:   Batch Loss = 0.356393, Accuracy = 0.9667999744415283
PERFORMANCE ON TEST SET: Batch Loss = 0.6249571442604065, Accuracy = 0.922097384929657
Training iter #584476:   Batch Loss = 0.358566, Accuracy = 0.9639999866485596
PERFORMANCE ON TEST SET: Batch Loss = 0.6291729807853699, Accuracy = 0.9183520674705505
Training iter #584480:   Batch Loss = 0.346657, Accuracy = 0.9735999703407288
PERFORMANCE ON TEST SET: Batch Loss = 0.6276390552520752, Accuracy = 0.9218477010726929
Training iter #584484:   Batch Loss = 0.353289, Accuracy = 0.9648000001907349
PERFORMANCE ON TEST SET: Batch Loss = 0.6270779371261597, Accuracy = 0.9208489656448364
Training iter #584488:   Batch Loss = 0.358287, Accuracy = 0.9656000137329102
PERFORMANCE ON TEST SET: Batch Loss = 0.6281952857971191, Accuracy = 0.9215980172157288
Training iter #584492:   Batch Loss = 0.342977, Accuracy = 0.9771999716758728
PERFORMANCE ON TEST SET: Batch Loss = 0.6257636547088623, Accuracy = 0.9208489656448364
Training iter #584496:   Batch Loss = 0.354498, Accuracy = 0.9660000205039978
PERFORMANCE ON TEST SET: Batch Loss = 0.6311811208724976, Accuracy = 0.9191011190414429
Training iter #584500:   Batch Loss = 0.358187, Accuracy = 0.9664000272750854
PERFORMANCE ON TEST SET: Batch Loss = 0.6262414455413818, Accuracy = 0.9215980172157288
Training iter #584504:   Batch Loss = 0.346113, Accuracy = 0.9732000231742859
PERFORMANCE ON TEST SET: Batch Loss = 0.6271432638168335, Accuracy = 0.9176030158996582
Training iter #584508:   Batch Loss = 0.360091, Accuracy = 0.9607999920845032
PERFORMANCE ON TEST SET: Batch Loss = 0.6279206871986389, Accuracy = 0.9200998544692993
Training iter #584512:   Batch Loss = 0.359824, Accuracy = 0.9692000150680542
PERFORMANCE ON TEST SET: Batch Loss = 0.6296336650848389, Accuracy = 0.916604220867157
Training iter #584516:   Batch Loss = 0.355421, Accuracy = 0.967199981212616
PERFORMANCE ON TEST SET: Batch Loss = 0.6291743516921997, Accuracy = 0.9186017513275146
Training iter #584520:   Batch Loss = 0.357501, Accuracy = 0.9631999731063843
PERFORMANCE ON TEST SET: Batch Loss = 0.6291993260383606, Accuracy = 0.9203495383262634
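For readers wondering how a log like this is produced: below is a minimal, framework-agnostic sketch of the train/evaluate loop behind lines in this format. The `train_step` and `evaluate_test_set` helpers are hypothetical placeholders (the original model code is not shown in this post); they only emit numbers in the ranges seen above so the print format can be checked.

```python
import random

# Hypothetical stand-ins for the real model and data pipeline: the post only
# shows the printed log, not the code that produced it, so these just return
# (loss, accuracy) pairs in the observed ranges.
def train_step():
    """One optimization step; returns (loss, accuracy) on the training batch."""
    return random.uniform(0.34, 0.38), random.uniform(0.95, 0.975)

def evaluate_test_set():
    """Full test-set evaluation; returns (loss, accuracy)."""
    return random.uniform(0.624, 0.645), random.uniform(0.90, 0.922)

step = 584292        # starting counter taken from the log above
step_increment = 4   # the counter advances in steps of 4 in this excerpt

while step <= 584520:
    loss, acc = train_step()
    # Training loss is printed to 6 decimals; accuracies and the test loss
    # are printed at full float precision, matching the log lines above.
    print(f"Training iter #{step}:   Batch Loss = {loss:.6f}, Accuracy = {acc}")
    test_loss, test_acc = evaluate_test_set()
    print(f"PERFORMANCE ON TEST SET: Batch Loss = {test_loss}, Accuracy = {test_acc}")
    step += step_increment
```

Note the structure: the test set is evaluated after every logged training step here, which is why the two line types alternate throughout the excerpt.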