whisper_ctx_init_openvino_encoder_with_state method
int whisper_ctx_init_openvino_encoder_with_state(
  Pointer<whisper_context> ctx,
  Pointer<whisper_state> state,
  Pointer<Char> model_path,
  Pointer<Char> device,
  Pointer<Char> cache_dir,
)
Given a context, enable use of OpenVINO for encode inference.

model_path: Optional path to the OpenVINO encoder IR model. If set to nullptr, the path will be generated from the ggml model path that was passed in to whisper_init_from_file. For example, if 'path_model' was "/path/to/ggml-base.en.bin", then the OpenVINO IR model path will be assumed to be "/path/to/ggml-base.en-encoder-openvino.xml".
device: OpenVINO device to run inference on ("CPU", "GPU", etc.).
cache_dir: Optional cache directory that can speed up init time, especially for GPU, by caching compiled 'blobs' there. Set to nullptr if not used.

Returns 0 on success. If OpenVINO is not enabled in the build, this simply returns 1.
Implementation
int whisper_ctx_init_openvino_encoder_with_state(
ffi.Pointer<whisper_context> ctx,
ffi.Pointer<whisper_state> state,
ffi.Pointer<ffi.Char> model_path,
ffi.Pointer<ffi.Char> device,
ffi.Pointer<ffi.Char> cache_dir,
) {
return _whisper_ctx_init_openvino_encoder_with_state(
ctx,
state,
model_path,
device,
cache_dir,
);
}
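The sketch below is a minimal usage example, not part of the bindings. It assumes the generated whisper bindings (whisper_context, whisper_state, and whisper_ctx_init_openvino_encoder_with_state) are already imported, that ctx and state were obtained elsewhere, and it uses package:ffi to allocate the device string. The helper name enableOpenVinoEncoder is hypothetical.

import 'dart:ffi' as ffi;
import 'package:ffi/ffi.dart';

// Assumes the generated whisper bindings (whisper_context, whisper_state,
// whisper_ctx_init_openvino_encoder_with_state) are imported from elsewhere.

/// Hypothetical helper: enables OpenVINO encode inference on ctx/state,
/// running on the CPU device.
int enableOpenVinoEncoder(
  ffi.Pointer<whisper_context> ctx,
  ffi.Pointer<whisper_state> state,
) {
  // Allocate a native C string for the OpenVINO device name.
  final device = 'CPU'.toNativeUtf8();
  try {
    return whisper_ctx_init_openvino_encoder_with_state(
      ctx,
      state,
      ffi.nullptr, // model_path: derive the IR path from the ggml model path
      device.cast<ffi.Char>(),
      ffi.nullptr, // cache_dir: no blob caching
    );
  } finally {
    malloc.free(device);
  }
}

Per the description above, a return value of 0 indicates success, while 1 indicates the library was built without OpenVINO support.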