nielsr committed on
Commit 6d183ae
1 Parent(s): 7ed199b

Add results table

Files changed (1)
  1. README.md +17 -0
README.md CHANGED
@@ -19,6 +19,23 @@ The other (non-default) version which can be used is:
  Disclaimer: The team releasing TAPAS did not write a model card for this model so this model card has been written by
  the Hugging Face team and contributors.
 
+ ## Results
+
+ Size | Reset | Dev Accuracy | Link
+ -------- | -------- | -------- | ----
+ LARGE | noreset | 0.5062 | [tapas-large-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/no_reset)
+ LARGE | reset | 0.5097 | [tapas-large-finetuned-wtq](https://huggingface.co/google/tapas-large-finetuned-wtq/tree/main)
+ BASE | noreset | 0.4525 | [tapas-base-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/no_reset)
+ BASE | reset | 0.4638 | [tapas-base-finetuned-wtq](https://huggingface.co/google/tapas-base-finetuned-wtq/tree/main)
+ MEDIUM | noreset | 0.4324 | [tapas-medium-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/no_reset)
+ MEDIUM | reset | 0.4324 | [tapas-medium-finetuned-wtq](https://huggingface.co/google/tapas-medium-finetuned-wtq/tree/main)
+ SMALL | noreset | 0.3681 | [tapas-small-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/no_reset)
+ SMALL | reset | 0.3762 | [tapas-small-finetuned-wtq](https://huggingface.co/google/tapas-small-finetuned-wtq/tree/main)
+ **MINI** | **noreset** | **0.2783** | [tapas-mini-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/no_reset)
+ **MINI** | **reset** | **0.2854** | [tapas-mini-finetuned-wtq](https://huggingface.co/google/tapas-mini-finetuned-wtq/tree/main)
+ TINY | noreset | 0.0823 | [tapas-tiny-finetuned-wtq (with absolute pos embeddings)](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/no_reset)
+ TINY | reset | 0.1039 | [tapas-tiny-finetuned-wtq](https://huggingface.co/google/tapas-tiny-finetuned-wtq/tree/main)
+
  ## Model description
 
  TAPAS is a BERT-like transformers model pretrained on a large corpus of English data from Wikipedia in a self-supervised fashion.
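
The branch names in the Link column are git revisions of each model repository, so either variant can be selected through the `revision` argument of `from_pretrained`. A minimal sketch (assuming the standard `transformers` TAPAS classes; the checkpoint name is taken from the bolded MINI rows in the table):

```python
from transformers import TapasTokenizer, TapasForQuestionAnswering

checkpoint = "google/tapas-mini-finetuned-wtq"

# Default checkpoint from the `main` branch ("reset" row of the table).
tokenizer = TapasTokenizer.from_pretrained(checkpoint)
model = TapasForQuestionAnswering.from_pretrained(checkpoint)

# Non-default "noreset" checkpoint (with absolute position embeddings),
# stored on the `no_reset` branch linked in the table.
model_no_reset = TapasForQuestionAnswering.from_pretrained(checkpoint, revision="no_reset")
```

The same `revision="no_reset"` switch applies to every size listed in the table, since each row links to the corresponding branch of its repository.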