WizardLM-2 8x22B is a state-of-the-art Mixture of Experts (MoE) model developed by Microsoft's WizardLM team, with significantly improved performance on complex dialogue, multilingual, reasoning, and agent tasks.
Large Language Model
Transformers
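
Since the model is listed under the Transformers library, a minimal usage sketch with Hugging Face Transformers is shown below. The repository id is an assumption; substitute the actual checkpoint id for the WizardLM-2 8x22B weights you are using, and note that the full 8x22B MoE requires substantial GPU memory.

```python
# Minimal sketch: loading and querying the model with Hugging Face Transformers.
# The model_id below is an assumed placeholder, not a confirmed repository name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/WizardLM-2-8x22B"  # assumption: replace with the real checkpoint id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # lower-precision weights to reduce memory use
    device_map="auto",           # shard the MoE layers across available GPUs
)

prompt = "Explain what a Mixture of Experts (MoE) model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```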