Lyra 12B + 16K Context
We're happy to let you know that Lyra 12B is now available for True Supporters and I'm All In subscribers.
This new model is fine-tuned by Sao10k (Stheno) on the recent Mistral Nemo 12B architecture. This newer architecture supports a longer context window and is particularly strong in English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. In our tests over the past few weeks, it has also outperformed all of our other smaller models, earning higher positive user ratings. We're also introducing 16K context support for I'm All In subscribers with this model.
This means roughly twice as many messages fit into the model's memory, even without the help of Semantic Memory 2.0. However, expect slower responses once a conversation goes beyond 8K context.
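To get an intuition for why doubling the context window roughly doubles the chat history the model can see, here is a minimal sketch. It is not our actual implementation: the `fit_messages` helper and the ~4-characters-per-token estimate are illustrative assumptions, and real tokenizers will give different counts.

```python
# Illustrative sketch: how many recent messages fit in a token budget.
# Assumption: ~4 characters per token; real tokenizers differ.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 characters per token)."""
    return max(1, len(text) // 4)

def fit_messages(messages: list[str], budget: int) -> list[str]:
    """Keep the most recent messages whose combined estimate fits the budget."""
    kept = []
    used = 0
    for msg in reversed(messages):  # walk from newest to oldest
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order

# A long chat history of similar-length messages:
history = [f"message {i}: " + "x" * 100 for i in range(600)]
in_8k = fit_messages(history, 8 * 1024)
in_16k = fit_messages(history, 16 * 1024)
print(len(in_8k), len(in_16k))  # the 16K window holds about twice as many
```

Because each message costs a similar number of tokens, a 16K budget keeps close to twice the messages an 8K budget does, always favoring the most recent ones.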