TensorFlow Distributed Training

1. Model parallelism, in-graph replication (a single client builds one graph whose ops are placed across multiple devices/workers); data parallelism, between-graph replication (each worker runs its own client, builds a similar copy of the graph, and shares parameters through parameter servers). A minimal between-graph setup is sketched below.
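
A minimal sketch of the between-graph replication bootstrap using the TensorFlow 1.x cluster API; the host:port addresses, job_name, and task_index values are placeholders that would normally come from command-line flags.

    # Between-graph replication skeleton (TensorFlow 1.x).
    # Cluster addresses and task settings below are placeholders.
    import tensorflow as tf

    cluster = tf.train.ClusterSpec({
        "ps":     ["ps0.example.com:2222"],          # parameter server(s)
        "worker": ["worker0.example.com:2222",
                   "worker1.example.com:2222"],      # training workers
    })

    job_name, task_index = "worker", 0  # normally parsed from flags

    server = tf.train.Server(cluster, job_name=job_name, task_index=task_index)

    if job_name == "ps":
        server.join()  # parameter servers host variables and block here
    else:
        # Pin variables to the ps job and ops to this worker; each worker
        # builds its own copy of the graph (between-graph replication).
        with tf.device(tf.train.replica_device_setter(
                worker_device="/job:worker/task:%d" % task_index,
                cluster=cluster)):
            global_step = tf.train.get_or_create_global_step()
            # ... build model, loss, optimizer, and train_op here ...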

 

tf.train.Supervisor: the older helper for managed training sessions (variable initialization, checkpointing, and recovery on the chief worker); deprecated in favor of tf.train.MonitoredTrainingSession.
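
A sketch of the Supervisor-based training loop; server, task_index, train_op, and global_step are assumed to come from a graph built as in the skeleton above, and the log directory is a placeholder.

    # Older Supervisor-style loop (TensorFlow 1.x, deprecated).
    import tensorflow as tf

    sv = tf.train.Supervisor(is_chief=(task_index == 0),
                             logdir="/tmp/train_logs",  # placeholder dir
                             global_step=global_step,
                             save_model_secs=600)
    with sv.managed_session(server.target) as sess:
        while not sv.should_stop():
            sess.run(train_op)  # chief also writes checkpoints/summaries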

tf.train.MonitoredTrainingSession: the recommended replacement; handles chief/non-chief session creation, initialization, recovery from checkpoints, and hooks.
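
A sketch of the MonitoredTrainingSession loop under the same assumptions as above (server, task_index, and train_op from the earlier skeleton); the checkpoint directory and step limit are placeholders.

    # Recommended monitored-session loop (TensorFlow 1.x).
    import tensorflow as tf

    hooks = [tf.train.StopAtStepHook(last_step=100000)]
    with tf.train.MonitoredTrainingSession(master=server.target,
                                           is_chief=(task_index == 0),
                                           checkpoint_dir="/tmp/train_logs",
                                           hooks=hooks) as mon_sess:
        while not mon_sess.should_stop():
            mon_sess.run(train_op)  # init, recovery, and checkpointing are handled for the chief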

 

Reference:

https://github.com/tensorflow/examples/blob/master/community/en/docs/deploy/distributed.md

 
