agnos.is Forums

MistralAI releases `Magistral`, their first official reasoning models. Magistral Small 2506 released under Apache 2.0 license!

LocalLLaMA · 5 posts · 3 posters
#1 · [email protected] (last edited by [email protected])

It seems Mistral finally released their own version of Small 3.1 2503 with a CoT reasoning pattern baked in. Before this, the best CoT finetune of Small was DeepHermes, which used DeepSeek's R1 distill patterns. According to the technical report, Mistral trained their own reasoning patterns for this one, so it's not just another DeepSeek distill finetune.

HuggingFace

Blog

Magistral technical report
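Since the new model emits a chain of thought before its final answer, client code usually needs to separate the two. A minimal sketch in Python, assuming the model wraps its reasoning in `<think>`…`</think>` delimiters (the exact tokens are an assumption here; check the model card on HuggingFace for the real ones):

```python
import re

# Reasoning models typically wrap their chain of thought in delimiter
# tokens before the final answer. The "<think>"/"</think>" pair below
# is an assumed convention, not confirmed for Magistral specifically.
THINK_RE = re.compile(r"<think>(.*?)</think>", re.DOTALL)

def split_reasoning(text: str) -> tuple[str, str]:
    """Return (chain_of_thought, final_answer) from a raw completion."""
    m = THINK_RE.search(text)
    if m is None:
        # No reasoning block found: treat the whole output as the answer.
        return "", text.strip()
    reasoning = m.group(1).strip()
    answer = text[m.end():].strip()
    return reasoning, answer

raw = "<think>2+2 is 4, doubled is 8.</think>The answer is 8."
cot, ans = split_reasoning(raw)
print(ans)  # -> The answer is 8.
```

Stripping the reasoning block like this also matters for multi-turn chat, since most reasoning models expect only the final answers (not the thought traces) to be fed back into the context.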

#2 · [email protected]

good work Mistral 🙂

#3 · [email protected]

They're still holding back bigger ones, unfortunately.

#4 · [email protected]

Yes, it would have been awesome of them to release a bigger one for sure 😞 At the end of the day they're still a business that needs a product to sell, and I don't want to be ungrateful by complaining that they don't give us everything. I expect that some day all these companies will clam up and stop releasing models to the public altogether, once the dust settles and the monopolies are entrenched. I'm happy to be here in an era where we can look forward to open-licence models released every few months.

#5 · [email protected] (last edited by [email protected])

"I don't want to be ungrateful complaining that they don't give us everything."

For sure.

But I guess it's still kinda… interesting? Like you'd think Qwen3, Gemma 3, Falcon H1, Nemotron 49B and such would pressure them to release Medium, but I guess there are factors that help them sell it.

As stupid as this is, they're European and specifically not Chinese. In the business world, there's this mostly irrational fear that the DeepSeek or Qwen weights by themselves will jump out of their cage and hack you, heh.
