🐣Ai2 Releasing OLMoE!
OLMoE-1B-7B-Instruct is a Mixture-of-Experts LLM with 1B active and 7B total parameters. OLMoE is 100% open source: model, codebase, and datasets!
🦖Paper: https://arxiv.org/abs/2409.02060
🤗Model:
allenai/OLMoE-1B-7B-0924-Instruct
💾Datasets:
allenai/OLMoE-mix-0924
🙇‍♂️Demo:
vilarin/OLMoE https://huggingface.co/spaces/vilarin/OLMoE
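Want to try it locally? Here is a minimal sketch using Hugging Face `transformers` (assuming a recent release with OLMoE support; prompt text is illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMoE-1B-7B-0924-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build a chat-formatted prompt and generate a reply.
messages = [{"role": "user", "content": "Explain mixture-of-experts in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)

# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```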