
# flutter_onnxruntime

Native Wrapper Flutter Plugin for ONNX Runtime

Current supported ONNX Runtime version: **1.21.0**
## Why This Project?

`flutter_onnxruntime` is a lightweight plugin that provides native wrappers for running ONNX Runtime on multiple platforms.
### No Pre-built Libraries

Libraries are fetched directly from official repositories during installation, ensuring they are always up to date.
### Memory Safety

All memory management is handled in native code, reducing the risk of memory leaks.
### Easy Upgrades

Stay current with the latest ONNX Runtime releases without the hassle of maintaining complex generated FFI wrappers.
## Getting Started

### Installation
Add the following dependency to your `pubspec.yaml`:

```yaml
dependencies:
  flutter_onnxruntime: ^1.2.0
```
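Then fetch the package with the standard Flutter command (requires the Flutter SDK on your PATH):

```shell
flutter pub get
```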
### Quick Start
Example of running an addition model:

```dart
import 'package:flutter_onnxruntime/flutter_onnxruntime.dart';

// Create an inference session from a bundled model asset
final ort = OnnxRuntime();
final session = await ort.createSessionFromAsset('assets/models/addition_model.onnx');

// Specify inputs with data and shape
final inputs = {
  'A': await OrtValue.fromList([1, 1, 1], [3]),
  'B': await OrtValue.fromList([2, 2, 2], [3]),
};

// Start the inference
final outputs = await session.run(inputs);

// Print the output data
print(await outputs['C']!.asList());
```
To get started with the Flutter ONNX Runtime plugin, see the API Usage Guide.
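Since tensor memory lives in native code, releasing it explicitly when you are done is good practice. A minimal sketch continuing the Quick Start above, assuming the plugin exposes `dispose()` on `OrtValue` and `close()` on `OrtSession` (verify the exact method names in the API Usage Guide):

```dart
// Hypothetical cleanup sketch -- dispose()/close() are assumed method names;
// consult the plugin's API Usage Guide for the actual resource-release API.
for (final value in inputs.values) {
  await value.dispose(); // free the native tensor backing each input
}
for (final value in outputs.values) {
  await value.dispose(); // output tensors hold native memory too
}
await session.close(); // release the native session itself
```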
## Examples

### Simple Addition Model
A simple model with only one operator (Add) that takes two inputs and produces one output.
Run this example with:

```shell
cd example
flutter pub get
flutter run
```
### Image Classification Model
A more complex model that takes an image as input and classifies it into one of the predefined categories.
Clone this repository and run the example following the repo's guidelines.
## Component Overview

| Component | Description |
|---|---|
| `OnnxRuntime` | Main entry point for creating sessions and configuring global options |
| `OrtSession` | Represents a loaded ML model for running inference |
| `OrtValue` | Represents tensor data for inputs and outputs |
| `OrtSessionOptions` | Configuration options for session creation |
| `OrtRunOptions` | Configuration options for inference execution |
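The two options classes tune how a session is created and how each inference call runs. A hedged sketch of how they might fit together; the constructor parameter names (`intraOpNumThreads`, `interOpNumThreads`) and the `options:` named arguments are assumptions, not confirmed API, so check the API Usage Guide for the real signatures:

```dart
import 'package:flutter_onnxruntime/flutter_onnxruntime.dart';

Future<void> runWithOptions() async {
  final ort = OnnxRuntime();

  // Hypothetical session configuration -- parameter names are assumptions.
  final sessionOptions = OrtSessionOptions(
    intraOpNumThreads: 2, // threads used within a single operator
    interOpNumThreads: 1, // threads used across independent operators
  );
  final session = await ort.createSessionFromAsset(
    'assets/models/addition_model.onnx',
    options: sessionOptions,
  );

  final inputs = {
    'A': await OrtValue.fromList([1, 1, 1], [3]),
    'B': await OrtValue.fromList([2, 2, 2], [3]),
  };

  // Hypothetical per-call configuration via OrtRunOptions.
  final outputs = await session.run(inputs, options: OrtRunOptions());
  print(await outputs['C']!.asList());
}
```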
## Implementation Status
| Feature | Android | iOS | Linux | macOS | Windows | Web |
|---|---|---|---|---|---|---|
| CPU Inference | ✅ | ✅ | ✅ | ✅ | | |
| Inference on Emulator | ✅ | ✅ | ✅ | ✅ | | |
| GPU Inference | ✅ | ✅ | ✅ | ✅ | | |
| Input/Output names | ✅ | ✅ | ✅ | ✅ | | |
| Input/Output Info | ✅ | ❌* | ✅ | ❌* | | |
| Model Metadata | ✅ | ❌* | ✅ | ❌* | | |
| Data Type Conversion | ✅ | ✅ | ✅ | ✅ | | |
| FP16 Support | ✅ | ❌** | ✏️ | ❌** | | |
- ✅: Complete
- ❌: Not supported
- 🚧: Ongoing
- ✏️: Planned
\* Retrieving model metadata and input/output info is not available in `onnxruntime-objc`; only names are available.

\*\* Swift does not support the FP16 type.
## Troubleshooting

For troubleshooting, see the `troubleshooting.md` file.
## Contributing

Contributions to the Flutter ONNX Runtime plugin are welcome. Please see the `contributing.md` file for more information.
## Documentation

Find more information in the documentation.