IBM has announced the availability of the popular open-source Mixtral-8x7B large language model (LLM), developed by Mistral AI, on its watsonx AI and data platform. The move continues IBM's expansion of capabilities that help clients innovate with both IBM's own foundation models and those from a range of open-source providers.

The addition of Mixtral-8x7B expands IBM’s open, multi-model strategy to meet clients where they are and give them choice and flexibility to scale enterprise AI solutions across their businesses.

Mixtral-8x7B was built using a combination of sparse modeling, a technique that activates only the most essential parts of the model for each input to make it more efficient, and the Mixture-of-Experts technique, which combines several specialized sub-networks ("experts") that each solve different parts of a problem. The model is widely known for its ability to rapidly process and analyze vast amounts of data and provide context-relevant insights.
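In this design, a learned router sends each token to only a small number of experts (in Mixtral's case, 2 of 8), so only a fraction of the model's parameters are active for any given token. The sketch below is a minimal, illustrative top-2 MoE layer in PyTorch; the class name, layer sizes, and untrained setup are assumptions for illustration, not Mistral AI's actual implementation.

```python
# Toy top-2 Mixture-of-Experts layer in the spirit of Mixtral's design
# (8 experts, 2 active per token). Illustrative only; dimensions are made up.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Each "expert" is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        # The router scores every expert for every token.
        self.router = nn.Linear(dim, num_experts)

    def forward(self, x):  # x: (tokens, dim)
        scores = self.router(x)                         # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)  # keep top-k experts per token
        weights = F.softmax(weights, dim=-1)            # normalize over chosen experts
        out = torch.zeros_like(x)
        # Sparsity: only the selected experts run for each token.
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e
                if mask.any():
                    out[mask] += weights[mask, k:k + 1] * expert(x[mask])
        return out

tokens = torch.randn(16, 64)   # 16 tokens, hidden size 64
layer = ToyMoELayer()
print(layer(tokens).shape)     # torch.Size([16, 64])
```

Because each token touches only two expert blocks, inference cost scales with the active parameters rather than the full parameter count, which is the efficiency gain the announcement alludes to.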

This week, IBM also announced the availability of ELYZA-japanese-Llama-2-7b, a Japanese LLM open-sourced by ELYZA Corporation, on watsonx. IBM also offers Meta's open-source models Llama-2-13B-chat and Llama-2-70B-chat, along with other third-party models, on watsonx.

https://newsroom.ibm.com/2024-02-29-IBM-Announces-Availability-of-Open-Source-Mistral-AI-Model-on-watsonx,-Expands-Model-Choice-to-Help-Enterprises-Scale-AI-with-Trust-and-Flexibility