Mixture-of-experts (MoE) is a neural-network architecture used in some AI systems, most notably large language models (LLMs). Instead of passing every input through one monolithic network, an MoE model routes each token to a small subset of specialized "expert" sub-networks, so only a fraction of the model's parameters is active per token. DeepSeek, which garnered major headlines, uses MoE. Here are ...
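
To make the idea concrete, here is a minimal sketch of what a top-k routed MoE layer might look like in PyTorch. The class name, expert count, and `top_k` value are illustrative assumptions for this sketch, not DeepSeek's actual design:

```python
# Minimal sketch of a mixture-of-experts layer with top-k routing.
# All names and hyperparameters here are illustrative, not DeepSeek's.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model: int, d_hidden: int,
                 num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: scores each token against each expert.
        self.router = nn.Linear(d_model, num_experts)
        # Experts: independent feed-forward sub-networks.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden),
                          nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(num_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten tokens for routing.
        tokens = x.reshape(-1, x.shape[-1])
        scores = self.router(tokens)                       # (tokens, num_experts)
        weights, indices = scores.topk(self.top_k, dim=-1) # pick top-k experts per token
        weights = F.softmax(weights, dim=-1)               # normalize over the chosen experts
        out = torch.zeros_like(tokens)
        # Each token is processed only by its top-k experts; the rest stay idle.
        for e, expert in enumerate(self.experts):
            token_ids, slot = (indices == e).nonzero(as_tuple=True)
            if token_ids.numel():
                out[token_ids] += (weights[token_ids, slot].unsqueeze(-1)
                                   * expert(tokens[token_ids]))
        return out.reshape(x.shape)
```

The key property is visible in the `forward` pass: the router selects `top_k` experts per token, so compute scales with the experts actually used rather than the total parameter count.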