Compared to its predecessor, Mistral Large 2 is significantly more capable in code generation, mathematics, and reasoning. It also provides much stronger multilingual support and advanced function-calling capabilities.
Immediately following Meta's release of Meta Llama 3.1, Mistral is out with Mistral Large 2, with "a 128k context window [that] supports dozens of languages including French, German, Spanish, Italian, Portuguese, Arabic, Hindi, Russian, Chinese, Japanese, and Korean, along with 80+ coding languages including Python, Java, C, C++, JavaScript, and Bash."
@chrismessina Congratulations on your product launch! I have used Mistral, and it's really cool. I'm surprised there are so few upvotes. Thank you for your hard work and dedication.
When trying out the snippet shown on the Hugging Face page, it returns an error:
from huggingface_hub import InferenceClient
import os

client = InferenceClient(
    "mistralai/Mistral-Large-Instruct-2407",
    token=os.getenv("hf_token"),
)

for message in client.chat_completion(
    messages=[{"role": "user", "content": "What is the capital of France?"}],
    max_tokens=500,
    stream=True,
):
    print(message.choices[0].delta.content, end="")
403 Forbidden: None.
Cannot access content at: https://api-inference.huggingfac....
If you are trying to create or update content, make sure you have a token with the `write` role.
The model mistralai/Mistral-Large-Instruct-2407 is too large to be loaded automatically (245GB > 10GB). Please use Spaces (https://huggingface.co/spaces) or Inference Endpoints (https://huggingface.co/inference...).
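The size limit in that last message is the real blocker for the serverless API, but a separate thing worth double-checking in the snippet is the environment-variable name: `os.getenv` is case-sensitive and silently returns `None` for an unset name, and passing `token=None` makes the client send an unauthenticated request, which can also surface as a 403. A minimal sketch of the pitfall (the variable names and value here are illustrative, not from the original post):

```python
import os

# Environment-variable lookups are case-sensitive on Linux/macOS, so
# "HF_TOKEN" and "hf_token" are different variables. If only the
# uppercase one is exported, the lowercase lookup yields None.
os.environ["HF_TOKEN"] = "hf_example"  # hypothetical token value
os.environ.pop("hf_token", None)       # ensure the lowercase name is unset

token = os.getenv("hf_token")  # lowercase name: not set -> None
print(token is None)           # prints: True
```

If the shell exports `HF_TOKEN`, the snippet should read it with the exact same spelling, e.g. `token=os.getenv("HF_TOKEN")`.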
Congrats on the launch, the Mistral team must have put a tremendous amount of effort into this.
No offense intended, but I've always wondered why smaller teams continue to work on similar projects when giants like OpenAI and Facebook already have such powerful offerings.
Wow, Mistral Large 2 sounds like a game changer! The improved code generation and reasoning are seriously impressive. Plus, that multilingual support is such a win for global users. I can't wait to see how this advances coding efficiency with those 80+ languages! Keep up the awesome work!