
Access Ollama API from Dart

Ollama for Dart #

A Dart client for interacting with the Ollama API. This library provides an easy-to-use interface for generating text completions, chat responses, and embeddings using the Ollama inference engine.

Features #

  • Generate text completions
  • Generate chat responses
  • Generate embeddings
  • Support for streaming responses
  • Customizable model parameters

Installation #

Run the following command to install the package:

```shell
dart pub add ollama
```

Usage #

Initializing the client #

```dart
import 'package:ollama/ollama.dart';

final ollama = Ollama();
// Or with a custom base URL:
// final ollama = Ollama(baseUrl: Uri.parse('http://your-ollama-server:11434'));
```

Generating text completions #

```dart
final stream = ollama.generate(
  'Tell me a joke about programming',
  model: 'llama3',
);

await for (final chunk in stream) {
  print(chunk.response);
}
```

Generating chat responses #

```dart
final messages = [
  ChatMessage(role: 'user', content: 'Hello, how are you?'),
];

final stream = ollama.chat(
  messages,
  model: 'llama3',
);

await for (final chunk in stream) {
  print(chunk.message?.content);
}
```
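Because the chunks stream in incrementally, it is often convenient to collect them into a single reply string. A minimal sketch building on the example above (it assumes each `chunk.message?.content` carries a text fragment, as shown; `chatOnce` is a hypothetical helper name):

```dart
import 'package:ollama/ollama.dart';

/// Sends a single user prompt and returns the fully assembled reply.
Future<String> chatOnce(Ollama ollama, String prompt) async {
  final buffer = StringBuffer();
  final stream = ollama.chat(
    [ChatMessage(role: 'user', content: prompt)],
    model: 'llama3',
  );
  // Concatenate the streamed fragments into one string.
  await for (final chunk in stream) {
    buffer.write(chunk.message?.content ?? '');
  }
  return buffer.toString();
}
```

This keeps the streaming behavior internally (so no response data is buffered on the server side) while presenting a simple request/response interface to callers.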

Generating embeddings #

```dart
final embeddings = await ollama.embeddings(
  'Here is an article about llamas...',
  model: 'llama3',
);

print(embeddings);
```
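Embedding vectors are usually compared with cosine similarity (dot product divided by the product of the vector norms). A minimal pure-Dart helper, assuming the returned embeddings can be treated as `List<double>` (the exact return type is not documented above):

```dart
import 'dart:math';

/// Cosine similarity between two equal-length vectors: a·b / (|a||b|).
/// Returns 1.0 for identical directions, 0.0 for orthogonal vectors.
double cosineSimilarity(List<double> a, List<double> b) {
  assert(a.length == b.length, 'Vectors must have the same dimension');
  var dot = 0.0, normA = 0.0, normB = 0.0;
  for (var i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (sqrt(normA) * sqrt(normB));
}
```

For example, `cosineSimilarity([1.0, 0.0], [1.0, 0.0])` is `1.0`, while `cosineSimilarity([1.0, 0.0], [0.0, 1.0])` is `0.0`. Embeddings of semantically related texts should score closer to 1.0 than unrelated ones.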

Contributing #

Contributions are welcome! Please feel free to submit a Pull Request.
