Chinese Version | English Version
The TensorFlow Lite Micro package (TFLu) is a port of the TensorFlow Lite Micro embedded inference framework to the RT-Thread real-time operating system. It enables deployment of TensorFlow Lite models on embedded systems constrained in memory, power consumption, and performance.
Platforms currently optimized or planned for optimization:
[x] Raspberry Pi 4 (Cortex-A72 core, 64-bit, gcc-arm-8.3 cross toolchain): RAM 28 KB, Flash 690 KB. Repository: https://github.com/QingChuanWS/raspi4-tfliteMicro
[x] ART-Pi (STM32H750, 32-bit, gcc-arm-none-eabi-9-2019): RAM 25 KB, Flash 542 KB
[ ] Nucleo-STM32L496 (STM32L496, 32-bit, gcc-arm-none-eabi-9-2019)
[ ] Kendryte K210 (K210, 64-bit, RISC-V architecture)
| Name | Description |
|---|---|
| docs | Documentation |
| examples | Official TensorFlow Lite Micro audio demo |
| tensorflow | TensorFlow Lite Micro library |
| third_party | Third-party libraries that TensorFlow Lite Micro depends on |
The TensorFlow Lite Micro package is released under the LGPLv2.1 license; see the LICENSE file for details.
RT-Thread 3.0+
To use the TensorFlow Lite Micro package, select it in the RT-Thread package manager. The specific path is:

```
RT-Thread online packages
    miscellaneous packages --->
        [*] Tensorflow Lite Micro: a lightweight deep learning end-test inference framework for RT-Thread operating system.
```
Then let the RT-Thread package manager update automatically, or run the `pkgs --update` command to download the package into the BSP.
After the TensorFlow Lite Micro package has been downloaded successfully, first configure it through menuconfig in RT-Thread's Env tool. The relevant configuration options are:
```
RT-Thread online packages
    miscellaneous packages --->
        [*] Tensorflow Lite Micro: a lightweight deep learning end-test inference framework for RT-Thread operating system.
              Version (latest)  --->
              Select Offical Example (Enable Tensorflow Lite Micro audio example)  --->
              Select Tensorflow Lite Operations Type (Using Tensorflow Lite reference operations)  --->
```
Among them, `Select Offical Example` offers two options:

```
(X) Enable Tensorflow Lite Micro audio example
( ) No Tensorflow Lite Micro example
```
Note: The audio example is the audio demo shipped with the official implementation. The "No example" option does not integrate the example files and provides only the standard TensorFlow Lite Micro framework.
`Select Tensorflow Lite Operations Type` also offers two options:

```
(X) Using Tensorflow Lite reference operations
( ) Using Tensorflow Lite CMSIS NN operations
```
Note: The reference operations are TFLMicro's general-purpose operator kernels (isolated from the platform and therefore highly portable), while the CMSIS NN operations use the CMSIS-NN library to accelerate operators on Arm Cortex-M cores. For caveats, please refer to the fourth section below.
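Whichever kernel type is selected, the application-side API stays the same: the menuconfig choice only decides which kernel implementations are compiled in. A minimal sketch of registering only the operators a model needs, using TFLM's `MicroMutableOpResolver` (the operator set shown here is an illustrative assumption, not tied to the bundled example):

```cpp
// Sketch: registering a model's operators with TFLM's
// MicroMutableOpResolver. The reference vs. CMSIS-NN kernel choice is
// made at build time by the menuconfig option, not in this code.
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"

// The template parameter is the maximum number of registered operators.
static tflite::MicroMutableOpResolver<4> resolver;

void RegisterOps() {
  // Hypothetical operator set for a small audio/classification model.
  resolver.AddDepthwiseConv2D();
  resolver.AddFullyConnected();
  resolver.AddSoftmax();
  resolver.AddReshape();
}
```

Registering only the needed operators keeps Flash usage down compared with pulling in the full operator set.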
TensorFlow Lite Micro is a relatively complex framework with a wide API surface. Please read introduction.md in the documentation first, then work through user-guide.md to understand the basic on-device deployment workflow for deep learning models. With that foundation, you can develop your own on-device deployment tasks.
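As a preview of the workflow the documents describe, the typical TFLM inference loop looks roughly like the sketch below. It assumes a model already converted to a C array (here named `g_model_data`, a hypothetical placeholder, e.g. produced with `xxd -i model.tflite`) and depends on the TFLM headers shipped in this package:

```cpp
// Minimal TFLM inference sketch. g_model_data and kArenaSize are
// illustrative assumptions; tune the arena size for your model.
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_data[];  // hypothetical model array

constexpr int kArenaSize = 10 * 1024;       // working memory for tensors
static uint8_t tensor_arena[kArenaSize];

int run_inference() {
  static tflite::MicroErrorReporter error_reporter;
  const tflite::Model* model = tflite::GetModel(g_model_data);
  if (model->version() != TFLITE_SCHEMA_VERSION) return -1;

  static tflite::AllOpsResolver resolver;  // or a MicroMutableOpResolver
  static tflite::MicroInterpreter interpreter(
      model, resolver, tensor_arena, kArenaSize, &error_reporter);

  if (interpreter.AllocateTensors() != kTfLiteOk) return -1;

  TfLiteTensor* input = interpreter.input(0);
  // ... fill input->data with preprocessed sensor data ...

  if (interpreter.Invoke() != kTfLiteOk) return -1;

  TfLiteTensor* output = interpreter.output(0);
  // ... read inference results from output->data ...
  return 0;
}
```

See user-guide.md for the full, board-specific version of this flow.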
The API manual is available at this link, which also lists the currently supported operators.
More documents are located under /docs; be sure to read them before use.
Notes on the `Using Tensorflow Lite CMSIS NN operations` option: