docling-project/SmolDocling-256M-preview
Image-Text-to-Text • Updated Sep 17, 2025 • 30.2k • 1.61k

The Ultra-Scale Playbook (Space, Running) • 3.83k
The ultimate guide to training LLMs on large GPU clusters

nomic-ai/nomic-embed-text-v2-moe
Sentence Similarity • 0.5B • Updated Apr 1, 2025 • 1.78M • 475

Article: SmolVLM Grows Smaller – Introducing the 256M & 500M Models!
andito, mfarre, merve (+1) • Jan 23, 2025 • 192