[2024.05.28] 🚀🚀🚀 MiniCPM-Llama3-V 2.5 is now fully supported in llama.cpp and ollama! Please pull the latest code from our provided forks (llama.cpp, ollama). GGUF models in various sizes are available here. The MiniCPM-Llama3-V 2.5 series is not yet supported by the official repositories; we are working hard to get our PRs merged. Please stay tuned! You can visit our GitHub repository for more information! A minimal usage sketch follows below.
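As a minimal sketch: once the GGUF model has been imported into the forked ollama build and the server is running locally, it can be queried through Ollama's standard REST API. The model tag `minicpm-llama3-v2.5` and the image path below are assumptions; substitute the tag you used when creating the model.

```python
import base64
import json
import urllib.request

# Assumed model tag; replace with the tag used when importing the GGUF model.
MODEL = "minicpm-llama3-v2.5"

# Ollama's /api/generate endpoint expects images as base64-encoded strings.
with open("demo.jpg", "rb") as f:  # hypothetical input image
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {
    "model": MODEL,
    "prompt": "What is in this image?",
    "images": [image_b64],
    "stream": False,  # return a single JSON response instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",  # default local ollama endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```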
[2024.05.28] 💫 We now support LoRA fine-tuning for MiniCPM-Llama3-V 2.5, using only 2 V100 GPUs! See more statistics here.
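For illustration only (not the repository's actual fine-tuning script), LoRA fine-tuning of this kind typically wraps the base model with a PEFT `LoraConfig` so that only small adapter matrices are trained. The rank, alpha, and target module names below are illustrative assumptions, not official settings.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModel

# Load the base model; trust_remote_code is required for MiniCPM-V models.
model = AutoModel.from_pretrained(
    "openbmb/MiniCPM-Llama3-V-2_5",
    trust_remote_code=True,
)

# Hypothetical LoRA configuration: r, lora_alpha, dropout, and the
# target attention projections are assumptions for illustration.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

# Wrap the model so only the LoRA adapter weights are trainable,
# which is what keeps the memory footprint within 2 V100 GPUs.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```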