# bge-m3_miracl_2cr
## Introduction
This repository describes how to reproduce the `Dense`, `Sparse`, and `Dense+Sparse` evaluation results of the [BGE-M3](https://arxiv.org/pdf/2402.03216.pdf) paper on the dev split of [MIRACL](https://direct.mit.edu/tacl/article/doi/10.1162/tacl_a_00595/117438/MIRACL-A-Multilingual-Retrieval-Dataset-Covering).
## Requirements
```bash
# Install Java (Linux)
apt update
apt install openjdk-21-jdk
# Install Pyserini
pip install pyserini
# Install Faiss
## CPU version
conda install -c conda-forge faiss-cpu
## GPU version
conda install -c conda-forge faiss-gpu
```
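After installation, a quick environment check (a sketch; the version attributes are assumed to be present in recent releases) confirms that Java, Pyserini, and Faiss are all available:
```bash
java -version                                             # OpenJDK 21 should be reported
python -c "import pyserini; print(pyserini.__version__)"  # Pyserini import check
python -c "import faiss; print(faiss.__version__)"        # Faiss import check
```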
**It should be noted that** Pyserini needs to be modified to support multiple alpha settings in `pyserini/fusion`. I have already submitted a pull request with this feature to the official repository; you can refer to this [PR](https://github.com/castorini/pyserini/pull/1858) to modify the code accordingly.
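If the PR has not yet been merged into the Pyserini release you installed, one way to pick up the change is to install Pyserini from source on the PR branch. This is only a sketch; the local branch name `fusion-multi-alpha` is arbitrary:
```bash
# Install Pyserini from source with the changes from PR #1858
git clone https://github.com/castorini/pyserini.git
cd pyserini
git fetch origin pull/1858/head:fusion-multi-alpha  # fetch the PR into a local branch
git checkout fusion-multi-alpha
pip install -e .                                    # editable install of the patched code
cd ..
```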
## 2CR
### Download and Unzip
```bash
# Download
## MIRACL topics and qrels
git clone https://huggingface.co/datasets/miracl/miracl
mkdir -p topics-and-qrels  # mv needs the target directory to exist
mv miracl/*/*/* topics-and-qrels
## Dense and Sparse Index
git lfs install
git clone https://huggingface.co/datasets/hanhainebula/bge-m3_miracl_2cr
cat bge-m3_miracl_2cr/dense/en.tar.gz.part_* > bge-m3_miracl_2cr/dense/en.tar.gz
cat bge-m3_miracl_2cr/dense/de.tar.gz.part_* > bge-m3_miracl_2cr/dense/de.tar.gz
# Unzip
languages=(ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo)
## Dense
for lang in ${languages[@]}; do
tar -zxvf bge-m3_miracl_2cr/dense/${lang}.tar.gz -C bge-m3_miracl_2cr/dense/
done
## Sparse
for lang in ${languages[@]}; do
tar -zxvf bge-m3_miracl_2cr/sparse/${lang}.tar.gz -C bge-m3_miracl_2cr/sparse/
done
```
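After extraction, a quick sanity check (a sketch, assuming the layout used by the commands below) is to confirm that the topics/qrels files and the per-language indexes are in place:
```bash
lang=zh  # any language from the list above
ls topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
   topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv    # MIRACL topics and qrels
ls bge-m3_miracl_2cr/dense/${lang}                       # dense (Faiss) index
ls bge-m3_miracl_2cr/sparse/${lang}/index \
   bge-m3_miracl_2cr/sparse/${lang}/query_embd.tsv       # sparse index and query vectors
```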
### Reproduction
#### Dense
```bash
# Available languages: ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo
lang=zh
# Generate run
python -m pyserini.search.faiss \
--threads 16 --batch-size 512 \
--encoder-class auto \
--encoder BAAI/bge-m3 \
--pooling cls --l2-norm \
--topics topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
--index bge-m3_miracl_2cr/dense/${lang} \
--output bge-m3_miracl_2cr/dense/runs/${lang}.txt \
--hits 1000
# Evaluate
## nDCG@10
python -m pyserini.eval.trec_eval \
-c -M 100 -m ndcg_cut.10 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/dense/runs/${lang}.txt
## Recall@100
python -m pyserini.eval.trec_eval \
-c -m recall.100 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/dense/runs/${lang}.txt
```
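To reproduce all languages in one pass, the per-language commands above can be wrapped in a loop (a sketch that simply repeats the same steps for each language):
```bash
languages=(ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo)
for lang in "${languages[@]}"; do
    # Generate the dense run for this language
    python -m pyserini.search.faiss \
      --threads 16 --batch-size 512 \
      --encoder-class auto \
      --encoder BAAI/bge-m3 \
      --pooling cls --l2-norm \
      --topics topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
      --index bge-m3_miracl_2cr/dense/${lang} \
      --output bge-m3_miracl_2cr/dense/runs/${lang}.txt \
      --hits 1000
    # Evaluate nDCG@10
    python -m pyserini.eval.trec_eval \
      -c -M 100 -m ndcg_cut.10 \
      topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
      bge-m3_miracl_2cr/dense/runs/${lang}.txt
done
```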
#### Sparse
```bash
# Available languages: ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo
lang=zh
# Generate run
python -m pyserini.search.lucene \
--threads 16 --batch-size 128 \
--topics bge-m3_miracl_2cr/sparse/${lang}/query_embd.tsv \
--index bge-m3_miracl_2cr/sparse/${lang}/index \
--output bge-m3_miracl_2cr/sparse/runs/${lang}.txt \
--output-format trec \
--impact --hits 1000
# Evaluate
## nDCG@10
python -m pyserini.eval.trec_eval \
-c -M 100 -m ndcg_cut.10 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/sparse/runs/${lang}.txt
## Recall@100
python -m pyserini.eval.trec_eval \
-c -m recall.100 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/sparse/runs/${lang}.txt
```
#### Dense+Sparse
**Note**: You should first apply this [PR](https://github.com/castorini/pyserini/pull/1858) to your Pyserini installation (see the Requirements section above) so that `pyserini/fusion` supports multiple alpha settings.
```bash
# Available languages: ar bn en es fa fi fr hi id ja ko ru sw te th zh de yo
lang=zh
# Generate dense run and sparse run
python -m pyserini.search.faiss \
--threads 16 --batch-size 512 \
--encoder-class auto \
--encoder BAAI/bge-m3 \
--pooling cls --l2-norm \
--topics topics-and-qrels/topics.miracl-v1.0-${lang}-dev.tsv \
--index bge-m3_miracl_2cr/dense/${lang} \
--output bge-m3_miracl_2cr/dense/runs/${lang}.txt \
--hits 1000
python -m pyserini.search.lucene \
--threads 16 --batch-size 128 \
--topics bge-m3_miracl_2cr/sparse/${lang}/query_embd.tsv \
--index bge-m3_miracl_2cr/sparse/${lang}/index \
--output bge-m3_miracl_2cr/sparse/runs/${lang}.txt \
--output-format trec \
--impact --hits 1000
# Generate dense+sparse run
mkdir -p bge-m3_miracl_2cr/fusion/runs
python -m pyserini.fusion \
--method interpolation \
--runs bge-m3_miracl_2cr/dense/runs/${lang}.txt bge-m3_miracl_2cr/sparse/runs/${lang}.txt \
--alpha 1 3e-5 \
--output bge-m3_miracl_2cr/fusion/runs/${lang}.txt \
--depth 1000 --k 1000
# Evaluation
## nDCG@10
python -m pyserini.eval.trec_eval \
-c -M 100 -m ndcg_cut.10 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/fusion/runs/${lang}.txt
## Recall@100
python -m pyserini.eval.trec_eval \
-c -m recall.100 \
topics-and-qrels/qrels.miracl-v1.0-${lang}-dev.tsv \
bge-m3_miracl_2cr/fusion/runs/${lang}.txt
```
Note:
- The hybrid method used for MIRACL in the BGE-M3 paper is `s_dense + 0.3 * s_sparse`. However, the sparse scores stored in the impact index have already been multiplied by 100^2, so the alpha for the sparse run here is 0.3 / 100^2 = 3e-5 instead of 0.3.
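As a quick check of the scaling (a sketch with made-up example scores, not values from the actual runs):
```bash
# With s_dense = 0.85 and s_sparse = 0.42, the index stores 100^2 * s_sparse = 4200.
# Interpolating with alphas (1, 3e-5) recovers s_dense + 0.3 * s_sparse:
python -c "print(1 * 0.85 + 3e-5 * (100**2 * 0.42))"  # 0.976 = 0.85 + 0.3 * 0.42
```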