[TF Lite] Working with TensorFlow Lite on Android with C++
一、Overview
Session notes: https://conferences.oreilly.com/tensorflow/tf-ca-2019/public/schedule/detail/78543 by Joe Bowser
There are many cases where developers on mobile write lower-level C++ code for their Android applications using the Android NDK, OpenCV and other technologies. Joe Bowser explores:
- How to use TF Lite's C++ API on Android alongside existing code, so that the code interacts directly with TF Lite without a round trip through the Java Native Interface (JNI) and the Android subsystem, allowing for cleaner, more portable code that can even be reused on iOS or other platforms (see the sketch below).
- Common pitfalls when working with TFLite as a C++ library, using TFLite with OpenCV and/or Halide on Android, and techniques for integration testing so that your tests can run in a CI/CD environment.
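To make the "no round trip through JNI" idea concrete, here is a minimal sketch (my own illustration, not code from the talk): all TF Lite work stays behind a plain C++ interface, Java crosses JNI only once per call through a thin shim, and the same C++ class can be reused on iOS. The class, package, and file names are hypothetical.

```cpp
// classifier.h -- portable C++ interface, no JNI or Android types (hypothetical API).
#pragma once
#include <vector>

class Classifier {
 public:
  explicit Classifier(const char* model_path);           // loads the .tflite model
  std::vector<float> Run(const std::vector<float>& in);  // runs one inference
};

// classifier_jni.cpp -- thin JNI shim; all real logic lives in Classifier.
#include <jni.h>
#include <vector>
#include "classifier.h"

extern "C" JNIEXPORT jlong JNICALL
Java_com_example_Classifier_nativeCreate(JNIEnv* env, jobject /*thiz*/, jstring model_path) {
  const char* path = env->GetStringUTFChars(model_path, nullptr);
  auto* classifier = new Classifier(path);
  env->ReleaseStringUTFChars(model_path, path);
  return reinterpret_cast<jlong>(classifier);
}

extern "C" JNIEXPORT jfloatArray JNICALL
Java_com_example_Classifier_nativeRun(JNIEnv* env, jobject /*thiz*/,
                                      jlong handle, jfloatArray input) {
  auto* classifier = reinterpret_cast<Classifier*>(handle);
  const jsize n = env->GetArrayLength(input);
  std::vector<float> in(n);
  env->GetFloatArrayRegion(input, 0, n, in.data());
  const std::vector<float> out = classifier->Run(in);
  jfloatArray result = env->NewFloatArray(static_cast<jsize>(out.size()));
  env->SetFloatArrayRegion(result, 0, static_cast<jsize>(out.size()), out.data());
  return result;
}
```

A possible implementation of Classifier::Run against the TF Lite C++ API is sketched under "TFLite C++ on Linux" below; because the header contains no JNI or Android types, the same class compiles unchanged on other platforms.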
What you'll learn
Discover the pros and cons of various approaches to using TensorFlow Lite in a production environment, and whether Java or C++ is the better choice for your project.
二、Low-Level Optimization
Ref: [ARM] Arm Compute Library for computer vision and machine learning
The goal here is to see how far the low-level optimizations in OpenCV and TFLite have come.
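One practical way to see what a given OpenCV build actually enables on a device is to query its optimization and hardware-support flags at runtime. This is a generic sketch using standard OpenCV calls, not something specific to the talk:

```cpp
#include <cstdio>
#include <opencv2/core.hpp>
#include <opencv2/core/utility.hpp>

int main() {
  // Are OpenCV's SIMD-optimized code paths enabled in this build?
  std::printf("Optimized code enabled: %d\n", cv::useOptimized());
  // Does the CPU report NEON support (the ARM SIMD extension used on most Android devices)?
  std::printf("NEON supported: %d\n", cv::checkHardwareSupport(CV_CPU_NEON));
  // Full build configuration: compiler flags, dispatched instruction sets, enabled backends.
  std::printf("%s\n", cv::getBuildInformation().c_str());
  return 0;
}
```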
三、Hybrid Integration
The point of hybrid integration is to make the most of hardware-level optimization.
Hardware: Coral Dev Board
See: https://aiyprojects.withgoogle.com/edge-tpu/
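On the Coral Dev Board, the Edge TPU is driven through the same TF Lite C++ API: you register the Edge TPU custom op and attach an Edge TPU device context to the interpreter. The sketch below follows the pattern from Coral's libedgetpu C++ examples; it assumes libedgetpu and the TF Lite headers are available in your build, and that the model has already been compiled for the Edge TPU.

```cpp
#include <memory>
#include "edgetpu.h"  // from libedgetpu
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Builds an interpreter whose Edge TPU custom ops run on the given device context.
// The context must outlive the returned interpreter.
std::unique_ptr<tflite::Interpreter> BuildEdgeTpuInterpreter(
    const tflite::FlatBufferModel& model, edgetpu::EdgeTpuContext* context) {
  tflite::ops::builtin::BuiltinOpResolver resolver;
  resolver.AddCustom(edgetpu::kCustomOp, edgetpu::RegisterCustomOp());
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(model, resolver)(&interpreter);
  interpreter->SetExternalContext(kTfLiteEdgeTpuContext, context);
  interpreter->AllocateTensors();
  return interpreter;
}

// Usage (paths are placeholders):
//   auto model = tflite::FlatBufferModel::BuildFromFile("model_edgetpu.tflite");
//   auto context = edgetpu::EdgeTpuManager::GetSingleton()->OpenDevice();
//   auto interpreter = BuildEdgeTpuInterpreter(*model, context.get());
```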
ACL C++ on Linux
/* implement */
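The placeholder above might be filled in roughly as follows: a minimal Arm Compute Library sketch that configures and runs a single NEON-accelerated GEMM. It assumes ACL is built and on your include/link path; shapes and coefficients are arbitrary.

```cpp
#include "arm_compute/core/Types.h"
#include "arm_compute/runtime/NEON/NEFunctions.h"
#include "arm_compute/runtime/Tensor.h"

using namespace arm_compute;

int main() {
  Tensor a, b, c;
  // Describe three 128x128 FP32 tensors (c = a * b).
  a.allocator()->init(TensorInfo(TensorShape(128U, 128U), 1, DataType::F32));
  b.allocator()->init(TensorInfo(TensorShape(128U, 128U), 1, DataType::F32));
  c.allocator()->init(TensorInfo(TensorShape(128U, 128U), 1, DataType::F32));

  // Configure the NEON GEMM function before allocating backing memory.
  NEGEMM gemm;
  gemm.configure(&a, &b, nullptr, &c, 1.0f, 0.0f);  // c = 1.0 * a * b + 0.0

  a.allocator()->allocate();
  b.allocator()->allocate();
  c.allocator()->allocate();

  // ... fill a and b with input data here ...

  gemm.run();  // executes on ACL's NEON-optimized kernels
  return 0;
}
```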
TFLite C++ on Linux
TFLite's GPU delegate, based on OpenGL ES, provides better performance; for more details see: [AR] TensorFlow Lite with GPU
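A minimal TF Lite C++ inference loop with the GPU delegate attached looks roughly like this. The header paths and the model filename are assumptions about how TF Lite is wired into your project; the delegate step is optional, and execution stays on the CPU kernels if it cannot be applied.

```cpp
#include <memory>
#include "tensorflow/lite/delegates/gpu/delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // Load the flatbuffer model ("model.tflite" is a placeholder path).
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // Optionally hand supported ops to the GPU delegate (OpenGL ES / OpenCL backend).
  TfLiteGpuDelegateOptionsV2 options = TfLiteGpuDelegateOptionsV2Default();
  TfLiteDelegate* gpu_delegate = TfLiteGpuDelegateV2Create(&options);
  if (interpreter->ModifyGraphWithDelegate(gpu_delegate) != kTfLiteOk) {
    // Delegate could not be applied; execution stays on the CPU kernels.
  }

  interpreter->AllocateTensors();
  float* input = interpreter->typed_input_tensor<float>(0);
  // ... copy preprocessed input data into `input` ...
  interpreter->Invoke();
  const float* output = interpreter->typed_output_tensor<float>(0);
  // ... read results from `output` ...

  interpreter.reset();                      // destroy the interpreter first
  TfLiteGpuDelegateV2Delete(gpu_delegate);  // then release the delegate
  return 0;
}
```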
Ref: Real Computer Vision for mobile and embedded. Part 2. Select the right tool.
- Performance: This framework was originally created for ML inference on embedded and low-end hardware, so its main compute resource is the CPU. That means large models may run slowly, drain a lot of battery, and such workloads can even overheat the phone.
With all of these disadvantages, TFLite is still almost the only tool that works across the full variety of Android ARM devices. It applies every available optimization to run your model efficiently on-device, and that is enough for many Android ML apps.
P.S. In the experimental branch of the TF Lite library you can find GPU acceleration support through OpenGL; it shows good results on the latest phone models.
- ML operations (layers) capability: Similar to the description for iOS. It is a good idea to use the TensorFlow framework for server-side training together with the official converter.
- Hardware specifications: Even though there are thousands of phone models, there is only a limited number of CPU architectures; about 99 percent of the market is ARM-based devices. TF Lite uses efficient CPU SIMD instructions (such as NEON) for ML inference (see the sketch after this list).
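To confirm the ARM/NEON point on a specific device, you can check it both at compile time and at runtime. The sketch below uses the NDK's cpufeatures helper (it assumes you link the cpufeatures static library that ships with the NDK); it is a generic illustration, not part of TF Lite.

```cpp
#include <cstdint>
#include <cstdio>
#include <cpu-features.h>  // NDK helper: sources/android/cpufeatures

void ReportCpu() {
  const AndroidCpuFamily family = android_getCpuFamily();
  const uint64_t features = android_getCpuFeatures();
  if (family == ANDROID_CPU_FAMILY_ARM) {
    std::printf("32-bit ARM, NEON available: %d\n",
                (features & ANDROID_CPU_ARM_FEATURE_NEON) != 0);
  } else if (family == ANDROID_CPU_FAMILY_ARM64) {
    // NEON (ASIMD) is mandatory on ARMv8-A, so this is effectively always true.
    std::printf("64-bit ARM, ASIMD available: %d\n",
                (features & ANDROID_CPU_ARM64_FEATURE_ASIMD) != 0);
  }
#if defined(__ARM_NEON)
  std::printf("This binary was compiled with NEON intrinsics enabled.\n");
#endif
}
```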