outetts_flutter 0.0.0
WIP: OuteTTS is a library for generating neural text-to-speech on edge devices, without an API key or internet quota.
# Outetts
OuteTTS is a library for running inference with any LLaMA / LLM AI model on edge devices, without an API or internet quota. The resources required depend on the model you want to run.
Copyright (c) 2024 GLOBAL CORPORATION - GENERAL DEVELOPER
## Docs
- Documentation
- Youtube
- Telegram Support Group
- Contact Developer (check social media or the README on my GitHub profile)
## Features
- ✅ Cross-platform support (devices, edge serverless functions)
- ✅ Standardized code style
- ✅ CLI (a terminal tool to help you use this library or create a project)
- ✅ API (if you develop a bot / userbot, you can use this library without interacting with the CLI; just add the library and use it)
- ✅ Customizable extensions (add extensions to speed up your development)
- ✅ Pretty information (user friendly for newcomers)
## Fun Fact
- This library is used in 100% of the projects I create (apps, servers, bots, userbots).
- This library supports 100% of the models from llama.cpp (depending on your device specs: high-end devices can run large models quickly, while low-end devices should choose a tiny/small model).
## Progress

- 10-02-2025: first stable release with core features
## Resources
## Install Library
- Dart

```bash
dart pub add outetts_dart
```

- Flutter

```bash
flutter pub add outetts_flutter ggml_library_flutter
```
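If you prefer editing `pubspec.yaml` by hand, the commands above are roughly equivalent to the sketch below. The version constraints here are placeholders, not confirmed releases; use whatever versions `pub add` resolves for you.

```yaml
# Sketch of a Flutter project's pubspec.yaml after installation.
# Version constraints below are assumptions, not pinned releases.
dependencies:
  flutter:
    sdk: flutter
  outetts_flutter: ^0.0.0
  ggml_library_flutter: any
```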
## Quick Start

A minimal quick-start script to give you a feel for the library; it is very simple to use:
```dart
import 'dart:io';

import 'package:outetts/outetts.dart';
import 'package:outetts/raw/lcpp.dart';

void main(List<String> args) async {
  print("start");
  File modelFile = File("../../../../../big-data/llama/Meta-Llama-3.1-8B-Instruct.Q8_0.gguf");
  final Outetts outetts = Outetts(
    sharedLibraryPath: "../outetts_flutter/linux/libllama.so",
  );
  await outetts.ensureInitialized();
  outetts.loadModel(modelPath: modelFile.path);

  /// Call this only when you actually want to use llama (e.g. on the page
  /// that needs it). Avoid calling it on low-end devices: if the device
  /// can't handle the load, the program will exit automatically, since
  /// llama's resource needs depend on the model. It runs fast on modern CPUs.
  await outetts.initialized();
  await for (final result in outetts.prompt(messages: [
    ChatMessage(
      role: "user",
      content: "What is Linux?",
    )
  ])) {
    print(result);
  }
  await outetts.dispose();
  outetts.stop();
  outetts.close();
  exit(0);
}
```
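Since `prompt` yields results as a stream (consumed with `await for` above), you will often want to accumulate the streamed chunks into a single reply string. A minimal sketch in plain Dart, assuming each streamed `result` is a text chunk; `collectChunks` is a hypothetical helper, not part of the library, and the demo stream below stands in for the real `outetts.prompt(...)` stream:

```dart
import 'dart:async';

/// Joins streamed text chunks into a single string.
/// (Hypothetical helper; not part of the OuteTTS API.)
Future<String> collectChunks(Stream<String> chunks) async {
  final buffer = StringBuffer();
  await for (final chunk in chunks) {
    buffer.write(chunk);
  }
  return buffer.toString();
}

void main() async {
  // Stand-in for the token stream returned by outetts.prompt(...).
  final demo = Stream.fromIterable(["Linux ", "is ", "a kernel."]);
  final reply = await collectChunks(demo);
  print(reply); // prints: Linux is a kernel.
}
```

This keeps the UI code free of stream plumbing: pass the prompt stream in, get the full response back once the model finishes.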
## Reference

- ggerganov/llama.cpp: the FFI bridge / core runtime that allows this program to run
## Example Project Using This Library

A Telegram application, redesigned with some new features: a userbot and other features not officially provided by Telegram. This project was initially open source, but we made it closed source because the code was easy to read, which would allow other people to modify it and use it for criminal acts.
| CHAT PAGE | SIGN UP PAGE | HOME PAGE | GUIDE PAGE |
|---|---|---|---|
| ![]() | ![]() | ![]() | ![]() |