[2024.06.03] You can now run MiniCPM-Llama3-V 2.5 on multiple low-VRAM GPUs (12 GB or 16 GB) by distributing the model's layers across them. For more details, check this link.
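For reference, below is a minimal sketch of one common way to split a model's layers across GPUs with Hugging Face transformers/accelerate. The linked guide may use a different device map; the `max_memory` caps here are illustrative assumptions for 12 GB cards, not official values.

```python
# Sketch: distribute MiniCPM-Llama3-V 2.5 layers across two low-VRAM GPUs
# using transformers' device_map="auto" (backed by accelerate).
import torch
from transformers import AutoModel, AutoTokenizer

model_path = "openbmb/MiniCPM-Llama3-V-2_5"

model = AutoModel.from_pretrained(
    model_path,
    trust_remote_code=True,
    torch_dtype=torch.float16,
    device_map="auto",                    # let accelerate place layers across GPUs
    max_memory={0: "11GiB", 1: "11GiB"},  # assumed caps: leave headroom on 12 GB cards
)
tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model.eval()
```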