flutter_onnxruntime

Native Wrapper Flutter Plugin for ONNX Runtime

Current supported ONNX Runtime version: 1.21.0

🌟 Why This Project?

flutter_onnxruntime is a lightweight plugin that provides native wrappers for running ONNX Runtime on multiple platforms.

  📦 No Pre-built Libraries
  Libraries are fetched directly from official repositories during installation, ensuring they are always up-to-date!

  🛡️ Memory Safety
  All memory management is handled in native code, reducing the risk of memory leaks.

  🔄 Easy Upgrades
  Stay current with the latest ONNX Runtime releases without the hassle of maintaining complex generated FFI wrappers.

🚀 Getting Started

Installation

Add the following dependency to your pubspec.yaml:

dependencies:
  flutter_onnxruntime: ^1.2.0

Quick Start

Example of running an addition model:

import 'package:flutter_onnxruntime/flutter_onnxruntime.dart';

// create inference session
final ort = OnnxRuntime();
final session = await ort.createSessionFromAsset('assets/models/addition_model.onnx');

// specify input with data and shape
final inputs = {
  'A': await OrtValue.fromList([1, 1, 1], [3]),
  'B': await OrtValue.fromList([2, 2, 2], [3]),
};

// start the inference
final outputs = await session.run(inputs);

// print output data
print(await outputs['C']!.asList());

To get started with the Flutter ONNX Runtime plugin, see the API Usage Guide.
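Because tensors are backed by native memory, it is good practice to release them explicitly once inference is done. The sketch below assumes the plugin exposes `dispose()` on OrtValue and `close()` on OrtSession; these method names are assumptions, so check the API Usage Guide for the exact cleanup API in your plugin version.

```dart
import 'package:flutter_onnxruntime/flutter_onnxruntime.dart';

Future<void> runOnce() async {
  final ort = OnnxRuntime();
  final session =
      await ort.createSessionFromAsset('assets/models/addition_model.onnx');

  final inputs = {
    'A': await OrtValue.fromList([1, 1, 1], [3]),
    'B': await OrtValue.fromList([2, 2, 2], [3]),
  };

  try {
    final outputs = await session.run(inputs);
    print(await outputs['C']!.asList());
  } finally {
    // Assumed cleanup API: release native tensor memory, then the session.
    for (final value in inputs.values) {
      await value.dispose();
    }
    await session.close();
  }
}
```

Wrapping the run in `try`/`finally` ensures native buffers are freed even if inference throws.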

🧪 Examples

Simple Addition Model

A simple model with only one operator (Add) that takes two inputs and produces one output.

Run this example with:

cd example
flutter pub get
flutter run

Image Classification Model

A more complex model that takes an image as input and classifies it into one of the predefined categories.

Clone this repository and run the example following the repo's guidelines.

📊 Component Overview

| Component | Description |
| --- | --- |
| OnnxRuntime | Main entry point for creating sessions and configuring global options |
| OrtSession | Represents a loaded ML model for running inference |
| OrtValue | Represents tensor data for inputs and outputs |
| OrtSessionOptions | Configuration options for session creation |
| OrtRunOptions | Configuration options for inference execution |
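As a sketch of how the option classes slot in: session-creation settings go on OrtSessionOptions, while per-call settings go on OrtRunOptions. The parameter names below (such as `intraOpNumThreads`) are illustrative assumptions, not confirmed API; consult the API Usage Guide for the options your plugin version actually exposes.

```dart
import 'package:flutter_onnxruntime/flutter_onnxruntime.dart';

Future<void> runWithOptions() async {
  final ort = OnnxRuntime();

  // Hypothetical session options; the real parameter names may differ.
  final sessionOptions = OrtSessionOptions(intraOpNumThreads: 2);
  final session = await ort.createSessionFromAsset(
    'assets/models/addition_model.onnx',
    options: sessionOptions,
  );

  final inputs = {
    'A': await OrtValue.fromList([1, 1, 1], [3]),
    'B': await OrtValue.fromList([2, 2, 2], [3]),
  };

  // Hypothetical run options applied to a single inference call.
  final outputs = await session.run(inputs, OrtRunOptions());
  print(await outputs['C']!.asList());
}
```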

🚧 Implementation Status

| Feature | Android | iOS | Linux | macOS | Windows | Web |
| --- | --- | --- | --- | --- | --- | --- |
| CPU Inference | ✅ | ✅ | ✅ | ✅ | | |
| Inference on Emulator | ✅ | ✅ | ✅ | ✅ | | |
| GPU Inference | ✅ | ✅ | ✅ | ✅ | | |
| Input/Output names | ✅ | ✅ | ✅ | ✅ | | |
| Input/Output Info | ✅ | ❌* | ✅ | ❌* | | |
| Model Metadata | ✅ | ❌* | ✅ | ❌* | | |
| Data Type Conversion | ✅ | ✅ | ✅ | ✅ | | |
| FP16 Support | ✅ | ❌** | ✍️ | ❌** | | |

✅: Complete

❌: Not supported

🚧: Ongoing

✍️: Planned

*: Retrieving model metadata and input/output info is not available in onnxruntime-objc; only the names are available.

**: Swift does not support the FP16 type.

๐Ÿ› ๏ธ Troubleshooting

For troubleshooting, see the troubleshooting.md file.

๐Ÿค Contributing

Contributions to the Flutter ONNX Runtime plugin are welcome. Please see the contributing.md file for more information.

📚 Documentation

Find more information in the documentation.