All lama-cleaner command-line arguments
```
usage: main.py [-h] [--host HOST] [--port PORT] [--config-installer]
               [--load-installer-config] [--installer-config INSTALLER_CONFIG]
               [--model {lama,ldm,zits,mat,fcf,sd1.5,cv2,manga,sd2,paint_by_example,instruct_pix2pix}]
               [--no-half] [--cpu-offload] [--disable-nsfw]
               [--sd-cpu-textencoder] [--local-files-only] [--enable-xformers]
               [--device {cuda,cpu,mps}] [--gui] [--no-gui-auto-close]
               [--gui-size GUI_SIZE GUI_SIZE] [--input INPUT]
               [--output-dir OUTPUT_DIR] [--model-dir MODEL_DIR]
               [--disable-model-switch]

optional arguments:
  -h, --help            show this help message and exit
  --host HOST
  --port PORT
  --config-installer    Open config web page, mainly for windows installer
                        (default: False)
  --load-installer-config
                        Load all cmd args from installer config file
                        (default: False)
  --installer-config INSTALLER_CONFIG
                        Config file for windows installer (default: None)
  --model {lama,ldm,zits,mat,fcf,sd1.5,cv2,manga,sd2,paint_by_example,instruct_pix2pix}
  --no-half             Using full precision model. If your generate result is
                        always black or green, use this argument.
                        (sd/paint_by_example) (default: False)
  --cpu-offload         Offloads all models to CPU, significantly reducing vRAM
                        usage. (sd/paint_by_example) (default: False)
  --disable-nsfw        Disable NSFW checker. (sd/paint_by_example)
                        (default: False)
  --sd-cpu-textencoder  Run Stable Diffusion text encoder model on CPU to save
                        GPU memory. (default: False)
  --local-files-only    Use local files only, not connect to Hugging Face
                        server. (sd/paint_by_example) (default: False)
  --enable-xformers     Enable xFormers optimizations. Requires xformers
                        package has been installed. See:
                        https://github.com/facebookresearch/xformers
                        (sd/paint_by_example) (default: False)
  --device {cuda,cpu,mps}
  --gui                 Launch Lama Cleaner as desktop app (default: False)
  --no-gui-auto-close   Prevent backend auto close after the GUI window closed.
                        (default: False)
  --gui-size GUI_SIZE GUI_SIZE
                        Set window size for GUI (default: [1600, 1000])
  --input INPUT         If input is image, it will be loaded by default. If
                        input is directory, you can browse and select image in
                        file manager. (default: None)
  --output-dir OUTPUT_DIR
                        Result images will be saved to output directory
                        automatically without confirmation. (default: None)
  --model-dir MODEL_DIR
                        Model download directory (by setting XDG_CACHE_HOME
                        environment variable), by default model downloaded to
                        ~/.cache (default: /Users/cwq/.cache)
  --disable-model-switch
                        Disable model switch in frontend (default: False)
```
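As a sketch of how a few of these flags combine in practice, the commands below use the `lama-cleaner` console script installed by pip; the model choices, port number, and input/output paths are placeholder assumptions, not values from the help text.

```bash
# Stable Diffusion 1.5 on GPU with reduced VRAM usage;
# add --no-half only if results come out black or green
lama-cleaner --model=sd1.5 --device=cuda --cpu-offload --sd-cpu-textencoder --port=8080

# Point --input at a directory to browse it in the file manager,
# and auto-save results to --output-dir (both paths are placeholders)
lama-cleaner --model=lama --device=cpu --input=./photos --output-dir=./cleaned --port=8080
```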
The easiest way to install Lama Cleaner is via pip (Python 3.7 ~ 3.10 supported):
```
# In order to use the GPU, install cuda version of pytorch first.
# pip install torch==1.13.1+cu117 --extra-index-url https://download.pytorch.org/whl/cu117
pip install lama-cleaner
```
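After installation, a minimal first launch might look like the following; this assumes the `lama-cleaner` console script that pip installs, and the port is an arbitrary choice rather than a documented default.

```bash
# Start the web UI with the classic LaMa inpainting model on CPU,
# then open http://localhost:8080 in a browser
lama-cleaner --model=lama --device=cpu --port=8080
```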