# YiJina / requirements.txt
huggingface_hub
einops
sentence-transformers
transformers
openai
python-dotenv
chromadb
langchain-community
langchain-chroma
unstructured[all-docs]
libmagic
gradio
torch==2.2.0+cu121  # CUDA 12.1 build of PyTorch; served from the PyTorch cu121 wheel index, not PyPI
flash-attn==2.6.3  # FlashAttention kernels; builds against the installed torch and CUDA toolkit
numpy<2  # pin below 2.0 to avoid NumPy 2.x compatibility conflicts
pybind11>=2.12  # needed by packages that compile native extensions
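# Note: the +cu121 torch wheel is not on PyPI. Assuming CUDA 12.1 drivers are
# installed, a typical install pulls it from the PyTorch wheel index, e.g.:
#   pip install -r requirements.txt --extra-index-url https://download.pytorch.org/whl/cu121
# Since flash-attn compiles against the installed torch, install torch first
# (or in the same resolve) when building flash-attn from source.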
# System dependencies for unstructured[all-docs]; install via the OS package
# manager (e.g. apt), not pip:
# poppler
# tesseract
# libxml2
# libxslt
# git+https://github.com/xlang-ai/instructor-embedding.git@4721e7375afeb8fcb32400a13057f9348bb69392