• cm0002@lemmy.world (OP) · 4 days ago

    Even though it is the smallest of the distilled models, that model still outperforms GPT-4o and Claude 3.5 Sonnet.

    The 7B-parameter models crush the older models on performance benchmarks, and the 14B-parameter model is very competitive with OpenAI o1-mini on many metrics.

    Yeah, sounds like it’s their smallest model.