DeepSeek
DeepSeek is an open-source large language model (LLM) that leverages a Mixture of Experts (MoE) architecture to optimize performance, activating only 37 billion of its 671 billion parameters during processing.
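
The sparse activation works by having a gating network select a small subset of experts per input, so only their parameters participate in the computation. The sketch below is a minimal, hypothetical top-k routing example in NumPy for illustration only; it is not DeepSeek's actual implementation, which uses a learned gate inside a transformer and far more experts.

```python
import numpy as np

def moe_forward(x, experts, gate_weights, top_k=2):
    """Route input x to the top_k highest-scoring experts and combine
    their outputs, weighted by softmax gate scores. Only the selected
    experts run, so most parameters stay idle for this input."""
    logits = x @ gate_weights                      # one score per expert
    top = np.argsort(logits)[-top_k:]              # indices of chosen experts
    probs = np.exp(logits[top] - logits[top].max())
    probs /= probs.sum()                           # softmax over the chosen experts
    # Weighted sum of only the selected experts' outputs
    return sum(p * experts[i](x) for p, i in zip(probs, top))

# Toy usage: 8 linear "experts", but only 2 run per input
rng = np.random.default_rng(0)
d = 16
experts = [(lambda W: (lambda v: v @ W))(rng.standard_normal((d, d)))
           for _ in range(8)]
gate_weights = rng.standard_normal((d, 8))
x = rng.standard_normal(d)
y = moe_forward(x, experts, gate_weights)
print(y.shape)  # (16,)
```

In this toy setup, 6 of the 8 experts are skipped on every call, which mirrors the ratio-style saving the card describes: most of the model's parameters sit idle on any given forward pass.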
Pricing: Free
Platforms: Web