11:44 · Aug 11, 2024 · Sun
[2024.05.24] We release the MiniCPM-Llama3-V 2.5 gguf, which supports llama.cpp inference and delivers smooth decoding at 6-8 tokens/s on mobile phones. Try it now!