Status: Executing - train.py
02/09/2019 09:33:38 INFO Log level set to: INFO
Using TensorFlow backend.
02/09/2019 09:33:39 INFO Model A Directory: /home/afda/faceswap/faceswap-master/output
02/09/2019 09:33:39 INFO Model B Directory: /home/afda/faceswap/faceswap-master/plugins/extract/align
02/09/2019 09:33:39 INFO Training data directory: /home/afda/faceswap/faceswap-master/models
02/09/2019 09:33:39 INFO Loading data, this may take a while...
02/09/2019 09:33:39 INFO Starting. Press 'ENTER' to stop training and save model
02/09/2019 09:33:39 INFO Loading Model from Model_Original plugin...
02/09/2019 09:33:39 WARNING Error loading training info: No such file or directory
02/09/2019 09:33:39 WARNING Failed loading existing training data. Starting a fresh model: /home/afda/faceswap/faceswap-master/models
02/09/2019 09:33:39 INFO Loading Trainer from Model_Original plugin...
Exception in thread Thread-3:
Traceback (most recent call last):
File "/usr/lib/python3.6/threading.py", line 916, in _bootstrap_inner
self.run()
File "/home/afda/faceswap/faceswap-master/lib/multithreading.py", line 187, in run
for item in self.generator:
File "/home/afda/faceswap/faceswap-master/lib/training_data.py", line 23, in minibatch
assert length >= batchsize, "Number of images is lower than batch-size (Note that too few images may lead to bad training). # images: {}, batch-size: {}".format(length, batchsize)
AssertionError: Number of images is lower than batch-size (Note that too few images may lead to bad training). # images: 33, batch-size: 64
The batch size requires at least 64 images, but there are only 33, which is why the face swap fails. If you really cannot collect that many images, you can duplicate some of them to pad the data set and get past TensorFlow's check.
A face swap only looks good when both sides have enough usable face data.
The swap direction is also confusing... what exactly is going on between input1 and input2?
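If you just need to get past the assertion above, a quick workaround is to copy existing face images until the folder holds at least batch-size files. The sketch below is a minimal, hypothetical helper (pad_faces is not part of faceswap); it assumes the A-side faces live in the output directory shown in the log. Duplicated images add no new training signal, so collecting real faces (or using a smaller batch size) is still the better fix.

import shutil
from pathlib import Path

# Hypothetical helper (not part of faceswap): copy existing face images until
# the directory holds at least `batch_size` files, so the minibatch assertion
# in lib/training_data.py passes.
def pad_faces(face_dir, batch_size=64):
    face_dir = Path(face_dir)
    images = sorted(p for p in face_dir.iterdir()
                    if p.suffix.lower() in {".png", ".jpg", ".jpeg"})
    if not images:
        raise RuntimeError("No images found in {}".format(face_dir))
    copies = 0
    while len(images) + copies < batch_size:
        src = images[copies % len(images)]
        dst = face_dir / "{}_copy{}{}".format(src.stem, copies, src.suffix)
        shutil.copy2(src, dst)
        copies += 1

# Example: pad the A-side faces directory from the log above.
pad_faces("/home/afda/faceswap/faceswap-master/output", batch_size=64)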