Experimenters have confirmed in overnight tests that the open-source DeepSeek R1 (a distilled variant) runs at around 200 tokens per second on a Raspberry Pi with no internet connection.
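The post doesn't say how the model was set up, but a common way to try the smallest distilled R1 on a local machine is via Ollama. This is a sketch of that workflow, not the experimenters' confirmed method; the `deepseek-r1:1.5b` tag is an assumption about which variant they used:

```shell
# Hypothetical reproduction sketch -- assumes Ollama is installed and the
# deepseek-r1:1.5b tag (the smallest distilled variant) is what was tested.
ollama pull deepseek-r1:1.5b    # one-time download of weights while online
ollama run deepseek-r1:1.5b "Why is the sky blue?"  # inference needs no network afterwards
```

Once the weights are pulled, inference runs entirely locally, which is what makes the non-internet-connected Raspberry Pi setup plausible.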
Even though it is the smallest of the distilled models, it still outperforms GPT-4o and Claude 3.5 Sonnet on several math reasoning benchmarks.
The 7B parameter model crushes those older models on reasoning benchmarks, and the 14B parameter model is very competitive with OpenAI o1-mini on many metrics.
Yeah, sounds like it's their smallest model.