lcpp 0.2.6
lcpp is a Dart implementation of llama.cpp used by the Mobile Artificial Intelligence Distribution (Maid) for local model inference.
0.0.1 #
Initial Release
0.0.2 #
Small Fixes
0.0.3 #
Add documentation
Improve ChatMessage class
0.0.4 #
Change pods to lcpp
0.0.5 #
Remove stop method (doesn't work)
0.0.6 #
Fix free
0.0.7 #
Add fromMap and toMap methods to ChatMessage
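As a rough illustration of how these serialization methods might be used, for example to persist chat history, here is a minimal sketch; the import path, named constructor, and map keys are assumptions for illustration, not confirmed lcpp API:

```dart
import 'package:lcpp/lcpp.dart';

void main() {
  // Hypothetical round-trip of a ChatMessage through a plain map.
  // The named constructor and 'role'/'content' keys are assumed
  // for illustration; check the lcpp docs for the real shape.
  final message = ChatMessage(role: 'user', content: 'Hello!');

  final Map<String, dynamic> asMap = message.toMap();
  final restored = ChatMessage.fromMap(asMap);

  assert(restored.content == message.content);
}
```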
0.0.8 #
Re-add stop method and improve cleanup logic
0.0.9 #
Add reload method to Llama and defaultParams factory to ContextParams
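A sketch of how reload and defaultParams could fit together, for instance to reset a model between conversations; the Llama constructor parameters and the exact signature of reload are assumptions for illustration only:

```dart
import 'package:lcpp/lcpp.dart';

void main() {
  // Hypothetical setup: parameter names are assumed, not confirmed.
  final params = ContextParams.defaultParams();
  final llama = Llama(modelPath: 'model.gguf', contextParams: params);

  // ... generate some responses ...

  // reload is assumed to reset the underlying llama.cpp context
  // without constructing a new Llama instance.
  llama.reload();
}
```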
0.1.0 #
Adjust JSON serialization
0.1.1 #
Small fix to JSON serialization
0.1.2 #
Small fix to JSON serialization
0.1.3 #
Fix regen
0.1.4 #
CMake for Android
0.1.5 #
CMake args for Android
0.1.6 #
Small fix for F-Droid
0.1.7 #
Update llama.cpp
Improve CI
Enable iOS support
0.1.8 #
Small fix for iOS
0.1.9 #
Update llama.cpp
Small changes to params
0.2.0 #
Refactor
Update llama.cpp
Add iOS action
0.2.1 #
Update llama.cpp
0.2.2 #
Refactor
Update llama.cpp
0.2.3 #
Fix iOS build
Enable Android optimizations
Update llama.cpp
0.2.4 #
WASM Ready
0.2.5 #
Update llama.cpp
Fix issue with messages
0.2.6 #
Update llama.cpp
Page size fix