tylercross committed
Commit
51c0e23
1 Parent(s): b68a1c0

Upload 7 files

Files changed (3)
  1. README.md +7 -7
  2. adapter_config.json +4 -4
  3. adapter_model.bin +1 -1
README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.3316
+- Loss: 2.3313
 
 ## Model description
 
@@ -51,12 +51,12 @@ The following hyperparameters were used during training:
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:-----:|:----:|:---------------:|
 | 2.4833 | 0.13 | 1 | 2.5152 |
-| 2.5615 | 0.27 | 2 | 2.5080 |
-| 2.4974 | 0.4 | 3 | 2.4701 |
-| 2.3926 | 0.53 | 4 | 2.4257 |
-| 2.3646 | 0.67 | 5 | 2.3834 |
-| 2.2345 | 0.8 | 6 | 2.3447 |
-| 2.1912 | 0.93 | 7 | 2.3316 |
+| 2.5615 | 0.26 | 2 | 2.5078 |
+| 2.4965 | 0.39 | 3 | 2.4691 |
+| 2.3902 | 0.52 | 4 | 2.4249 |
+| 2.3629 | 0.65 | 5 | 2.3824 |
+| 2.2324 | 0.77 | 6 | 2.3441 |
+| 2.1907 | 0.9 | 7 | 2.3313 |
 
 
 ### Framework versions
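
The README change just re-records the evaluation loss (2.3316 → 2.3313) and the per-step validation numbers, apparently from a re-run of the same fine-tune. For context, loading an adapter produced by a run like this typically looks like the sketch below; the adapter repo id is a placeholder, since the commit does not state where the adapter is hosted.

```python
# Minimal sketch of loading this LoRA adapter on top of the base model
# with transformers + peft. The adapter_id is a placeholder, NOT taken
# from this commit; substitute the actual Hub repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-v0.1"
adapter_id = "your-username/your-adapter-repo"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)

# PeftModel.from_pretrained reads adapter_config.json and adapter_model.bin
# from the adapter repo and wraps the base model with the LoRA layers.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```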
adapter_config.json CHANGED
@@ -16,13 +16,13 @@
   "rank_pattern": {},
   "revision": null,
   "target_modules": [
-    "q_proj",
-    "down_proj",
+    "gate_proj",
     "v_proj",
     "up_proj",
     "o_proj",
-    "gate_proj",
-    "k_proj"
+    "q_proj",
+    "k_proj",
+    "down_proj"
   ],
   "task_type": "CAUSAL_LM"
 }
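
Note that this hunk only reorders target_modules: both versions name the same seven Mistral projection layers (all four attention projections plus the three MLP projections), and since the field is semantically a set, the same layers receive LoRA weights either way. A minimal sketch of the equivalent peft LoraConfig, with rank, alpha, and dropout as placeholders since this hunk does not show them:

```python
# Sketch of a peft LoraConfig matching the updated adapter_config.json.
# r / lora_alpha / lora_dropout are placeholders; the diff above only
# shows target_modules and task_type.
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,               # placeholder: rank not shown in this hunk
    lora_alpha=32,      # placeholder: alpha not shown in this hunk
    lora_dropout=0.05,  # placeholder
    target_modules=[
        "gate_proj", "v_proj", "up_proj", "o_proj",
        "q_proj", "k_proj", "down_proj",
    ],
    task_type="CAUSAL_LM",
)
```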
adapter_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:beb31d0f2cb78ec193e995e4d798192928a24ab69de356455ef73a2e56d4bafb
+oid sha256:70c0859d3ab66e4c0ca38e9a1d5c4f55c26411dedc2ec00922ba89cc33b33535
 size 335706186
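
adapter_model.bin is stored through Git LFS, so the diff only shows the pointer file: the oid is the SHA-256 of the actual weight file. The size is unchanged at 335706186 bytes while the hash differs, which is consistent with retrained weights of identical shape. A sketch for verifying a downloaded copy against the new pointer:

```python
# Check a downloaded adapter_model.bin against the git-lfs pointer above:
# the pointer's oid is simply the SHA-256 of the file's contents.
import hashlib

EXPECTED = "70c0859d3ab66e4c0ca38e9a1d5c4f55c26411dedc2ec00922ba89cc33b33535"

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so the ~335 MB weight file
    is never held in memory all at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

assert sha256_of("adapter_model.bin") == EXPECTED, "hash mismatch"
```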