
Remember when @mistralAI said "Large Enough" and casually dropped Mistral-Large-Instruct-2407? 🤯🚀

It's now on http://lmsys.org! 🌐 It works amazingly well for instruction following, hard prompts, coding, and longer queries with only 123 billion parameters. 💡💻

It outperforms GPT-4-Turbo and Claude 3 Opus in the Coding, Hard Prompts, Math, and Longer Query categories. 📈🔢

It also outperforms Llama 3.1 405B on Instruction Following while being 3x smaller. πŸŽπŸ”

It also does exceedingly well on the Ai2 ZebraLogic logical reasoning benchmark despite being much smaller than the other models. 🦓🤔

Mistral is not here to take part but to take over! πŸ†πŸŒŸ

Model: https://mistral.ai/news/mistral-large-2407/
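
Want to try it yourself? A minimal sketch of calling the model through Mistral's OpenAI-style chat completions endpoint — the endpoint URL and the `mistral-large-2407` model id are assumptions based on Mistral's public API conventions, so check their docs before relying on them:

```python
import json
import urllib.request

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed chat endpoint
API_KEY = "YOUR_API_KEY"  # replace with your own key

# Build an OpenAI-style chat request; "mistral-large-2407" is the assumed
# API model id corresponding to Mistral-Large-Instruct-2407.
payload = {
    "model": "mistral-large-2407",
    "messages": [
        {"role": "user", "content": "Write a Python one-liner to reverse a string."}
    ],
    "temperature": 0.2,
}

req = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Uncomment to actually send the request (requires a valid key):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

The request body follows the same chat-completions schema as other hosted LLM APIs, so swapping in a different model id is a one-line change.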