Llama 3.2 Collection This collection hosts the transformers and original repos of the Llama 3.2 and Llama Guard 3 models • 11 items • Updated 11 days ago • 329
Jamba-1.5 Collection The AI21 Jamba family of models comprises state-of-the-art, hybrid SSM-Transformer instruction-following foundation models • 2 items • Updated Aug 22 • 76
Gemma 2: Improving Open Language Models at a Practical Size Paper • 2408.00118 • Published Jul 31 • 73
Article Metric and Relative Monocular Depth Estimation: An Overview. Fine-Tuning Depth Anything V2 By Isayoften • Jul 10 • 32
Article LAVE: Zero-shot VQA Evaluation on Docmatix with LLMs - Do We Still Need Fine-Tuning? Jul 25 • 18
Article Fine-tuning Florence-2 - Microsoft's Cutting-edge Vision Language Models Jun 24 • 171
Depth Anything v2 Release Collection A comprehensive collection on DAv2 • 5 items • Updated Jun 18 • 10
MobileNetV4 pretrained weights Collection Weights for MobileNet-V4 pretrained in timm • 17 items • Updated 14 days ago • 13
Scaling Laws and Compute-Optimal Training Beyond Fixed Training Durations Paper • 2405.18392 • Published May 28 • 12
Article Multimodal Augmentation for Documents: Recovering "Comprehension" in "Reading and Comprehension" task By danaaubakirova • May 16 • 17
Article A Dive into Pretraining Strategies for Vision-Language Models Feb 3, 2023 • 36
PaliGemma Release Collection Pretrained and mix checkpoints for PaliGemma • 16 items • Updated Jul 31 • 136