# Ultralytics YOLO Flutter Package

A Flutter plugin for YOLO (You Only Look Once) models, supporting object detection, segmentation, classification, pose estimation, and oriented bounding boxes (OBB) on both Android and iOS.
## Features
- Object Detection: Identify and locate objects in images and camera feeds with bounding boxes
- Segmentation: Perform pixel-level segmentation of objects
- Classification: Classify objects in images
- Pose Estimation: Detect human poses and keypoints
- Oriented Bounding Boxes (OBB): Detect rotated or oriented bounding boxes for objects
- Cross-Platform: Works on both Android and iOS
- Real-time Processing: Optimized for real-time inference on mobile devices
- Camera Integration: Easy integration with device cameras
## Installation

Add this to your package's `pubspec.yaml` file:

```yaml
dependencies:
  ultralytics_yolo: ^0.0.5
```

Then run:

```bash
flutter pub get
```
## Platform-Specific Setup

### Android

Add the following permissions to your `AndroidManifest.xml` file:

```xml
<!-- For camera access -->
<uses-permission android:name="android.permission.CAMERA" />

<!-- For accessing images from storage -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
```

Set the minimum SDK version in your `android/app/build.gradle`:

```gradle
minSdkVersion 21
```
### iOS

Add these entries to your `Info.plist`:

```xml
<key>NSCameraUsageDescription</key>
<string>This app needs camera access to detect objects</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app needs photos access to get images for object detection</string>
```
## Usage

### Basic Example
```dart
import 'package:flutter/material.dart';
import 'package:ultralytics_yolo/yolo.dart';
import 'package:ultralytics_yolo/yolo_view.dart';
import 'package:ultralytics_yolo/yolo_task.dart';

// A StatefulWidget is required here: initState is used to set the
// initial thresholds, and the slider value is kept in widget state.
class YoloDemo extends StatefulWidget {
  const YoloDemo({super.key});

  @override
  State<YoloDemo> createState() => _YoloDemoState();
}

class _YoloDemoState extends State<YoloDemo> {
  // Create a controller to interact with the YoloView
  final controller = YoloViewController();
  double _confidence = 0.5;

  @override
  void initState() {
    super.initState();
    // Set initial detection parameters
    controller.setThresholds(
      confidenceThreshold: 0.5,
      iouThreshold: 0.45,
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: const Text('YOLO Object Detection')),
      body: Column(
        children: [
          // Controls for adjusting detection parameters
          Padding(
            padding: const EdgeInsets.all(8.0),
            child: Row(
              children: [
                const Text('Confidence: '),
                Slider(
                  value: _confidence,
                  min: 0.1,
                  max: 0.9,
                  onChanged: (value) {
                    // Update confidence threshold
                    setState(() => _confidence = value);
                    controller.setConfidenceThreshold(value);
                  },
                ),
              ],
            ),
          ),
          // YoloView with controller
          Expanded(
            child: YoloView(
              controller: controller,
              task: YOLOTask.detect,
              modelPath: 'assets/models/yolo11n.tflite',
              onResult: (results) {
                // Handle detection results
                print('Detected ${results.length} objects');
              },
            ),
          ),
        ],
      ),
    );
  }
}
```
### Object Detection with Camera Feed

There are three ways to control YoloView's detection parameters:

#### Method 1: Using a Controller (Recommended)

```dart
// Create a controller outside the build method
final controller = YoloViewController();

// In your build method:
YoloView(
  controller: controller, // Provide the controller
  task: YOLOTask.detect,
  modelPath: 'assets/models/yolo11n.tflite',
  onResult: (results) {
    for (var result in results) {
      print('Detected: ${result.className}, Confidence: ${result.confidence}');
    }
  },
)

// Set detection parameters anywhere in your code
controller.setConfidenceThreshold(0.5);
controller.setIoUThreshold(0.45);

// Or set both at once
controller.setThresholds(
  confidenceThreshold: 0.5,
  iouThreshold: 0.45,
);
```
#### Method 2: Using GlobalKey Direct Access (Simpler)

```dart
// Create a GlobalKey to access the YoloView
final yoloViewKey = GlobalKey<YoloViewState>();

// In your build method:
YoloView(
  key: yoloViewKey, // Important: Provide the key
  task: YOLOTask.detect,
  modelPath: 'assets/models/yolo11n.tflite',
  onResult: (results) {
    for (var result in results) {
      print('Detected: ${result.className}, Confidence: ${result.confidence}');
    }
  },
)

// Set detection parameters directly through the key
yoloViewKey.currentState?.setConfidenceThreshold(0.6);
yoloViewKey.currentState?.setIoUThreshold(0.5);

// Or set both at once
yoloViewKey.currentState?.setThresholds(
  confidenceThreshold: 0.6,
  iouThreshold: 0.5,
);
```
#### Method 3: Automatic Controller (Simplest)

```dart
// No controller needed - just create the view
YoloView(
  task: YOLOTask.detect,
  modelPath: 'assets/models/yolo11n.tflite',
  onResult: (results) {
    for (var result in results) {
      print('Detected: ${result.className}, Confidence: ${result.confidence}');
    }
  },
)

// A controller is automatically created internally
// with default threshold values (0.5 for confidence, 0.45 for IoU)
```
### Image Segmentation

```dart
// Simplest approach - no controller needed
YoloView(
  task: YOLOTask.segment,
  modelPath: 'assets/models/yolo11n-seg.tflite',
  onResult: (results) {
    // Process segmentation results
  },
)

// An internal controller is automatically created
// with default thresholds (0.5 confidence, 0.45 IoU)
```
### Pose Estimation

```dart
// Using the GlobalKey approach for direct access
final yoloViewKey = GlobalKey<YoloViewState>();

YoloView(
  key: yoloViewKey,
  task: YOLOTask.pose,
  modelPath: 'assets/models/yolo11n-pose.tflite',
  onResult: (results) {
    // Process pose keypoints
  },
)

// Update parameters directly through the key
yoloViewKey.currentState?.setConfidenceThreshold(0.6);
```
## API Reference

### Classes

#### YOLO

Main class for YOLO operations.

```dart
YOLO({
  required String modelPath,
  required YOLOTask task,
});
```
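The `YOLO` class can also be used for one-off inference outside of a `YoloView`. A minimal sketch, assuming the plugin exposes `loadModel()` and `predict()` methods (check the package's API docs before relying on these names):

```dart
import 'dart:typed_data';

import 'package:ultralytics_yolo/yolo.dart';
import 'package:ultralytics_yolo/yolo_task.dart';

Future<void> runInference(Uint8List imageBytes) async {
  // Create a YOLO instance bound to a model and task
  final yolo = YOLO(
    modelPath: 'assets/models/yolo11n.tflite',
    task: YOLOTask.detect,
  );

  // Assumed API: load the model, then run inference on raw image bytes
  await yolo.loadModel();
  final results = await yolo.predict(imageBytes);
  print('Inference returned: $results');
}
```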
#### YoloViewController

Controller for interacting with a YoloView, managing settings such as thresholds.

```dart
// Create a controller
final controller = YoloViewController();

// Get current values
double confidence = controller.confidenceThreshold;
double iou = controller.iouThreshold;

// Set confidence threshold (0.0-1.0)
await controller.setConfidenceThreshold(0.6);

// Set IoU threshold (0.0-1.0)
await controller.setIoUThreshold(0.5);

// Set both thresholds at once
await controller.setThresholds(
  confidenceThreshold: 0.6,
  iouThreshold: 0.5,
);
```
#### YoloView

Flutter widget to display YOLO detection results.

```dart
YoloView({
  required YOLOTask task,
  required String modelPath,
  YoloViewController? controller, // Optional: Controller for managing view settings
  Function(List<YOLOResult>)? onResult,
});

// YoloView methods (when accessed via GlobalKey<YoloViewState>)
Future<void> setConfidenceThreshold(double threshold);
Future<void> setIoUThreshold(double threshold);
Future<void> setThresholds({
  double? confidenceThreshold,
  double? iouThreshold,
});
```

Note: You can control YoloView in three ways:

- Provide a controller to the constructor
- Access the view directly via a GlobalKey
- Don't provide anything and let the view create an internal controller

See the examples above for detailed usage patterns.
#### YOLOResult

Contains detection results.

```dart
class YOLOResult {
  final int classIndex;
  final String className;
  final double confidence;
  final Rect boundingBox;
  // For segmentation
  final List<List<double>>? mask;
  // For pose estimation
  final List<Point>? keypoints;
}
```
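As a sketch of consuming these fields in an `onResult` callback (the 0.6 cutoff and label formatting here are illustrative choices, not part of the API):

```dart
// Keep only confident detections and summarize them as readable strings.
// Assumes the YOLOResult fields declared above.
List<String> summarize(List<YOLOResult> results) {
  return results
      .where((r) => r.confidence >= 0.6)
      .map((r) =>
          '${r.className} (${(r.confidence * 100).toStringAsFixed(1)}%) '
          'at ${r.boundingBox}')
      .toList();
}
```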
### Enums

#### YOLOTask

```dart
enum YOLOTask {
  detect,   // Object detection
  segment,  // Image segmentation
  classify, // Image classification
  pose,     // Pose estimation
  obb,      // Oriented bounding boxes
}
```
## Platform Support

| Android | iOS | Web | macOS | Windows | Linux |
| ------- | --- | --- | ----- | ------- | ----- |
| ✅      | ✅  | ❌  | ❌    | ❌      | ❌    |
## Model Loading

### Model Placement Options

This package supports loading models from multiple locations:

1. **Flutter Assets Directory**

   - Android: Place `.tflite` files in your Flutter `assets` directory
   - iOS: Place `.mlmodel` files in your Flutter `assets` directory
   - Specify in `pubspec.yaml`:

     ```yaml
     flutter:
       assets:
         - assets/models/
     ```

   - Reference in code: `modelPath: 'assets/models/your_model.tflite'` or `modelPath: 'your_model.tflite'`

2. **App Internal Storage**

   - Use when downloading models at runtime
   - Android path: `/data/user/0/<package_name>/app_flutter/`
   - iOS path: `/Users/<username>/Library/Application Support/<bundle_id>/`
   - Reference using the `internal://` scheme: `modelPath: 'internal://models/your_model.tflite'`
   - Or with an absolute path: `modelPath: '/absolute/path/to/your_model.tflite'`

3. **iOS-specific: Bundle Resources**

   - `.mlpackage` files must be added directly to the iOS Runner target in Xcode
   - Reference with just the name: `modelPath: 'your_model'`
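The runtime-download option above can be sketched as follows. This assumes the `http` and `path_provider` packages and a placeholder model URL; the key point is that files written under the app's documents directory become addressable via the `internal://` scheme:

```dart
import 'dart:io';

import 'package:http/http.dart' as http;
import 'package:path_provider/path_provider.dart';

// Placeholder URL - substitute your own model host.
const modelUrl = 'https://example.com/models/yolo11n.tflite';

Future<String> downloadModel() async {
  // App documents directory (app_flutter/ on Android)
  final dir = await getApplicationDocumentsDirectory();
  final file = File('${dir.path}/models/yolo11n.tflite');
  await file.parent.create(recursive: true);

  // Fetch the model and write it to internal storage
  final response = await http.get(Uri.parse(modelUrl));
  await file.writeAsBytes(response.bodyBytes);

  // The saved file can now be referenced via the internal:// scheme
  return 'internal://models/yolo11n.tflite';
}
```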
### Path Reference Behavior by Platform

#### Android Path Resolution

1. **Base Model Names**: `modelPath: 'yolo11n'`

   - Automatically appends the `.tflite` extension → `yolo11n.tflite`
   - First searches in assets for `yolo11n.tflite`
   - Falls back to assets for `yolo11n` if not found

2. **Asset Paths**: `modelPath: 'assets/models/yolo11n.tflite'`

   - Searches in assets for the exact path

3. **App Internal Storage**: `modelPath: 'internal://models/yolo11n.tflite'`

   - Resolves to `/data/user/0/<package_name>/app_flutter/models/yolo11n.tflite`

4. **Absolute Paths**: `modelPath: '/path/to/yolo11n.tflite'`

   - Used directly to load from the file system
#### iOS Path Resolution

1. **Base Model Names**: `modelPath: 'yolo11n'`

   - Searches for resources in this order:
     1. `yolo11n.mlmodelc` in the main bundle
     2. `yolo11n.mlpackage` in the main bundle
   - Does not automatically append extensions for asset paths

2. **Asset Paths**: `modelPath: 'assets/models/yolo11n.mlmodel'`

   - For `.mlmodel` files, searches in Flutter assets
   - `.mlpackage` files cannot be loaded from assets

3. **Absolute Paths**: `modelPath: '/path/to/model.mlmodel'`

   - Used directly if the file exists and has a valid extension (`.mlmodel` or `.mlpackage`)
### Platform-Specific Model Format Notes

- **Android**: Uses TensorFlow Lite (`.tflite`) models

  - The extension is automatically appended if missing
  - Always use TFLite-format models
  - More flexible path handling with fallback mechanisms

- **iOS**: Uses Core ML models

  - Supports `.mlmodel`, `.mlmodelc` (compiled), and `.mlpackage` formats
  - Important: `.mlpackage` files cannot be accessed from Flutter assets
  - `.mlpackage` files must be added directly to the iOS Runner target in Xcode
  - `.mlmodel` files can be placed in Flutter assets
  - Stricter path resolution compared to Android
You can get the available storage paths at runtime:

```dart
final paths = await YOLO.getStoragePaths();
print("Internal storage path: ${paths['internal']}");
```
## Troubleshooting

### Common Issues

1. **Model loading fails**

   - Make sure your model file is correctly placed as described above
   - Verify that the model path is correctly specified
   - For iOS, ensure `.mlpackage` files are added directly to the Xcode project
   - Check that the model format is compatible with TFLite (Android) or Core ML (iOS)
   - Use `YOLO.checkModelExists(modelPath)` to verify that your model can be found

2. **Low performance on older devices**

   - Try using smaller models (e.g., YOLOv8n instead of YOLOv8l)
   - Reduce the input image resolution
   - Increase the confidence threshold to reduce the number of detections:

     ```dart
     // Using a controller
     yoloController.setConfidenceThreshold(0.7); // Higher value = fewer detections

     // Or using a GlobalKey
     yoloViewKey.currentState?.setConfidenceThreshold(0.7);
     ```

   - Adjust the IoU threshold to control overlapping detections:

     ```dart
     // Using a controller
     yoloController.setIoUThreshold(0.5); // Higher value = fewer merged boxes

     // Or using a GlobalKey
     yoloViewKey.currentState?.setIoUThreshold(0.5);
     ```

3. **Camera permission issues**

   - Ensure that your app has the proper permissions in the manifest or `Info.plist`
   - Handle runtime permissions properly in your app
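Runtime permission handling is outside the scope of this plugin; a common approach is the community `permission_handler` package (an assumption here, not a dependency of this plugin). A minimal sketch:

```dart
import 'package:permission_handler/permission_handler.dart';

// Request camera access before showing a YoloView.
// Returns true only when the user has granted the permission.
Future<bool> ensureCameraPermission() async {
  final status = await Permission.camera.status;
  if (status.isGranted) return true;

  final result = await Permission.camera.request();
  return result.isGranted;
}
```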
## License
This project is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0) - see the LICENSE file for details.