Files changed (1)

README.md (+2 -2)

@@ -50,8 +50,8 @@ Model evaluation metrics and results.
 
 | Benchmark                                      | Metric        | Llama-2-7b-instruct | Llama-2-7b-pruned50-retrained-instruct |
 |------------------------------------------------|---------------|---------------------|----------------------------------------|
-| [MMLU](https://arxiv.org/abs/2009.03300)       | 5-shot, top-1 | 48.60%              | 45.10%                                 |
-| [HellaSwag](https://arxiv.org/abs/1905.07830)  | 0-shot        | 79.45%              | 78.86%                                 |
+| [MMLU](https://arxiv.org/abs/2009.03300)       | 5-shot        | 48.60%              | 45.10%                                 |
+| [HellaSwag](https://arxiv.org/abs/1905.07830)  | 10-shot       | 79.45%              | 78.86%                                 |
 | [WinoGrande](https://arxiv.org/abs/1907.10641) | 5-shot        | 75.69%              | 72.61%                                 |
 | [ARC-c](https://arxiv.org/abs/1911.01547)      | 25-shot       | 53.92%              | 50.77%                                 |
 | [TruthfulQA](https://arxiv.org/abs/2109.07958) | 0-shot        | 43.63%              | 44.40%                                 |