💫 **Easy Usage.** MiniCPM-V 2.6 can be easily used in various ways: (1) llama.cpp and ollama support for efficient CPU inference on local devices, (2) int4 and GGUF format quantized models in 16 sizes, (3) vLLM support for high-throughput and memory-efficient inference, (4) fine-tuning on new domains and tasks, (5) quick local WebUI demo setup with Gradio, and (6) online web demo.

![MiniCPM-V 2.6 benchmark radar chart](https://github.com/OpenBMB/MiniCPM-V/raw/main/assets/radar_final.png)