Cross-compiling PaddleLite for the Raspberry Pi 3B to run image classification
Links:
https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_env.html
https://paddle-lite.readthedocs.io/zh/latest/demo_guides/x86.html
https://paddle-lite.readthedocs.io/zh/latest/source_compile/compile_linux.html
https://paddle-lite.readthedocs.io/zh/latest/demo_guides/linux_arm_demo.html
System information:
- Ubuntu 18.04
- Python 3.7
- Paddle 2.0.2
- PaddleLite 2.9
Steps:
- Prepare the build environment: install git, cmake, and the cross-compilation toolchains
sudo apt-get update
sudo apt-get install cmake git
sudo apt-get install g++-arm-linux-gnueabi gcc-arm-linux-gnueabi \
    g++-arm-linux-gnueabihf gcc-arm-linux-gnueabihf \
    gcc-aarch64-linux-gnu g++-aarch64-linux-gnu
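The three packages above cover soft-float ARMv7, hard-float ARMv7 (the Raspberry Pi 3B), and 64-bit ARMv8 targets. As a sketch of which GCC prefix goes with which PaddleLite --arch value (the `toolchain_for` helper is a made-up illustration, not part of PaddleLite):

```shell
#!/bin/sh
# Hypothetical helper: map a PaddleLite --arch value to its GCC cross prefix.
toolchain_for() {
  case "$1" in
    armv7)   echo arm-linux-gnueabi   ;;  # soft-float ARMv7
    armv7hf) echo arm-linux-gnueabihf ;;  # hard-float ARMv7 (Raspberry Pi 3B)
    armv8)   echo aarch64-linux-gnu   ;;  # 64-bit ARM (e.g. RK3399)
    *)       echo unknown; return 1   ;;
  esac
}

toolchain_for armv7hf   # prints arm-linux-gnueabihf
```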
- Build the x86 version of PaddleLite, to validate the converted model on the host first
git clone https://github.com/PaddlePaddle/Paddle-Lite.git
cd Paddle-Lite/
git checkout -b v2.9
rm -rf third-party
./lite/tools/build.sh --build_python=ON x86
- Install the Python paddlelite package, then use paddle_lite_opt to convert the model exported by Paddle
cd Paddle-Lite/build.lite.x86/inference_lite_lib/python/install/dist
pip install paddlelite-15ee38677-cp37-cp37m-linux_x86_64.whl
cd model_dir/
mkdir -p Paddle-Lite/build.lite.x86/inference_lite_lib/demo/cxx/mobilenetv1_light/model
cp model.pdmodel Paddle-Lite/build.lite.x86/inference_lite_lib/demo/cxx/mobilenetv1_light/model
cp model.pdiparams Paddle-Lite/build.lite.x86/inference_lite_lib/demo/cxx/mobilenetv1_light/model
cd Paddle-Lite/build.lite.x86/inference_lite_lib/demo/cxx/mobilenetv1_light/model
paddle_lite_opt --model_file=./model.pdmodel --param_file=./model.pdiparams \
    --optimize_out=./model --optimize_out_type=naive_buffer --valid_targets=x86
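The same conversion is run twice in these notes, once with --valid_targets=x86 and later with --valid_targets=arm, so a thin wrapper that only varies the target saves retyping. The `convert_model` function and the dry-run echo are illustrative assumptions; only the paddle_lite_opt flags themselves come from the steps above:

```shell
#!/bin/sh
# Hypothetical wrapper around paddle_lite_opt; only --valid_targets differs
# between the x86 host check and the ARM deployment build.
convert_model() {
  target="$1"   # x86 or arm
  # echo instead of executing, so the composed command can be inspected first
  echo paddle_lite_opt \
    --model_file=./model.pdmodel \
    --param_file=./model.pdiparams \
    --optimize_out=./model \
    --optimize_out_type=naive_buffer \
    --valid_targets="$target"
}

convert_model x86
```

Dropping the leading `echo` (or piping the output to `sh`) would actually run the conversion.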
- Build and run the mobilenetv1_light demo on the host
cd Paddle-Lite/build.lite.x86/inference_lite_lib/demo/cxx/mobilenetv1_light
sh build.sh
cp ../../../third_party/mklml/lib/lib* ./
cp ../../../cxx/lib/libpaddle_light_api_shared.so ./
./mobilenet_light_api model/model.nb
- Cross-compile the armlinux version of PaddleLite
cd Paddle-Lite/
./lite/tools/build_linux.sh --arch=armv7hf --with_extra=ON --with_static_lib=ON --with_cv=ON
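build_linux.sh places its output under a directory named after the OS, arch, and toolchain, which is where the paths used in the next step come from. A sketch of that naming convention (the `build_dir` helper is an assumption, derived only from the paths that appear in these notes):

```shell
#!/bin/sh
# Hypothetical helper reproducing the output-directory naming seen in these
# notes: build.lite.<os>.<arch>.<toolchain>
build_dir() {
  echo "build.lite.$1.$2.$3"
}

build_dir linux armv7hf gcc   # prints build.lite.linux.armv7hf.gcc
```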
- Use paddle_lite_opt to convert the model exported by Paddle, this time targeting ARM
cd model_dir/
mkdir -p Paddle-Lite/build.lite.linux.armv7hf.gcc/inference_lite_lib.armlinux.armv7hf/demo/cxx/mobilenetv1_light/model
cp model.pd* Paddle-Lite/build.lite.linux.armv7hf.gcc/inference_lite_lib.armlinux.armv7hf/demo/cxx/mobilenetv1_light/model
cd Paddle-Lite/build.lite.linux.armv7hf.gcc/inference_lite_lib.armlinux.armv7hf/demo/cxx/mobilenetv1_light/model
paddle_lite_opt --model_file=./model.pdmodel --param_file=./model.pdiparams \
    --optimize_out=./model --optimize_out_type=naive_buffer --valid_targets=arm
- Build mobilenetv1_light, then copy it to the Raspberry Pi 3B and run it
vim CMakeLists.txt   # set the target arch and link flags:
    set(ARMLINUX_ARCH_ABI armv7hf)
    add_definitions(-std=c++11 -O3 -pthread -Wl,--rpath=./)
sh build.sh
scp mobilenet_light_api model.nb libpaddle_light_api_shared.so pi@192.168.123.62:/home/pi/mobilenet_light_api
ssh pi@192.168.123.62
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:./
./mobilenet_light_api model.nb
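The copy-and-run step above can be collected into one helper so redeploying after a rebuild is a single command. The `deploy_cmds` function is hypothetical; it echoes the commands (dry run) rather than executing them, using the same host and path as the notes:

```shell
#!/bin/sh
# Hypothetical deploy helper for the Pi at pi@192.168.123.62 (as in the notes).
# Commands are echoed so they can be reviewed; remove the quotes-and-echo
# wrapping to execute them for real.
deploy_cmds() {
  PI=pi@192.168.123.62
  DEST=/home/pi/mobilenet_light_api
  echo "scp mobilenet_light_api model.nb libpaddle_light_api_shared.so $PI:$DEST"
  # LD_LIBRARY_PATH must include the binary's own directory, since
  # libpaddle_light_api_shared.so is copied next to it
  echo "ssh $PI 'cd $DEST && LD_LIBRARY_PATH=. ./mobilenet_light_api model.nb'"
}

deploy_cmds
```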
- Build and run Paddle-Lite-Demo
git clone https://github.com/PaddlePaddle/Paddle-Lite-Demo.git
cd Paddle-Lite-Demo/PaddleLite-armlinux-demo
./download_models_and_libs.sh
scp *.h pi@192.168.123.62:/home/pi/yuandanfei/Paddle-Lite-Demo/PaddleLite-armlinux-demo/Paddle-Lite/include
scp libpaddle_light_api_shared.so pi@192.168.123.62:/home/pi/yuandanfei/Paddle-Lite-Demo/PaddleLite-armlinux-demo/Paddle-Lite/libs/armv7hf
cd Paddle-Lite-Demo/PaddleLite-armlinux-demo/image_classification_demo
vim run.sh   # select the arch abi for the Raspberry Pi 3B:
    #TARGET_ARCH_ABI=armv8  # for RK3399, the default arch abi
    TARGET_ARCH_ABI=armv7hf # for Raspberry Pi 3B
sh run.sh
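The run.sh edit above can also be done non-interactively with sed, which is handy when scripting the whole demo setup. A sketch, simulated here on a temporary file rather than the real image_classification_demo/run.sh:

```shell
#!/bin/sh
# Hypothetical: flip TARGET_ARCH_ABI to armv7hf without opening an editor.
# Demonstrated on a temp file standing in for run.sh.
f=$(mktemp)
echo 'TARGET_ARCH_ABI=armv8' > "$f"   # the default (RK3399) setting

# rewrite whatever value is set to the Raspberry Pi 3B arch
sed -i 's/^TARGET_ARCH_ABI=.*/TARGET_ARCH_ABI=armv7hf/' "$f"

cat "$f"   # prints TARGET_ARCH_ABI=armv7hf
rm -f "$f"
```

Pointing `sed -i` at run.sh itself (instead of the temp file) applies the same change in place.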