Hugging Face open-sources its top-performing small model
Jin10 Data reported on July 9 that, in the early hours of the day, Hugging Face, the globally renowned open-source platform for large models, open-sourced SmolLM3, a top-performing small-parameter model. SmolLM3 has only 3 billion parameters, yet its performance significantly surpasses open-source models of similar size such as Llama-3.2-3B and Qwen2.5-3B. It offers a 128k context window and supports six languages, including English, French, Spanish, and German. It also provides dual reasoning modes, deep thinking and non-thinking, which users can switch between flexibly.
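Below is a minimal sketch of how such a mode switch might look with the transformers library, assuming the model is published on the Hugging Face Hub as "HuggingFaceTB/SmolLM3-3B" and that the reasoning mode is toggled via "/think" and "/no_think" system-prompt flags; both details are assumptions to verify against the official model card.

```python
# Hypothetical sketch: toggling SmolLM3's deep-thinking vs. non-thinking mode.
# The repository name and the "/think" / "/no_think" flags are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HuggingFaceTB/SmolLM3-3B"  # assumed Hub repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

def generate(prompt: str, thinking: bool) -> str:
    # Reasoning mode is assumed to be controlled through a system-prompt flag.
    system = "/think" if thinking else "/no_think"
    messages = [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
    ]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    )
    outputs = model.generate(inputs, max_new_tokens=512)
    # Decode only the newly generated tokens.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

print(generate("Summarize the benefits of a 128k context window.", thinking=True))
```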