fluid.io.load_inference_model raises an error when loading multiple models -- [paddlepaddle]
Summary: when multiple models are deployed in the same service, a stack error occurs. The cause is that the program is global. Changing it as follows resolves it. solved by myself. for those who need it: use a new scope for every model scope = fluid.Scope() wit
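The summary above is clipped by the listing. A minimal sketch of the workaround it describes, using the paddle.fluid 1.x API (`fluid.Scope()` plus `fluid.scope_guard()` around `fluid.io.load_inference_model`); the model directory names here are hypothetical placeholders, not from the original post:

```python
import paddle.fluid as fluid

exe = fluid.Executor(fluid.CPUPlace())

# Give each model its own Scope so their variables do not
# collide in the (otherwise global) default scope.
scopes = {}
models = {}
for model_dir in ["./model_a", "./model_b"]:  # hypothetical paths
    scope = fluid.Scope()
    with fluid.scope_guard(scope):
        # load_inference_model returns the inference program,
        # the feed variable names, and the fetch targets
        program, feed_names, fetch_targets = \
            fluid.io.load_inference_model(dirname=model_dir, executor=exe)
    scopes[model_dir] = scope
    models[model_dir] = (program, feed_names, fetch_targets)

# At serving time, re-enter the matching scope before running:
# with fluid.scope_guard(scopes[model_dir]):
#     exe.run(program, feed={...}, fetch_list=fetch_targets)
```

The key point is that every `exe.run` for a given model must happen inside that model's `scope_guard`, so each program reads its own parameters.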
posted @ 2019-06-11 12:22