# Web framework and ASGI server
fastapi[standard]
uvicorn

# LLM orchestration framework and its core abstractions
langchain
langchain-core

# Python bindings for llama.cpp; the extra index serves prebuilt CPU-only wheels
--extra-index-url https://abetlen.github.io/llama-cpp-python/whl/cpu
llama-cpp-python

# Hugging Face Hub client (pinned) for model downloads
huggingface-hub==0.19.4

# PyTorch and its computer-vision companion package
torch
torchvision

# Client for the fal.ai API
fal-client
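
# Note: assuming this list is consumed as a standard pip requirements file
# (e.g. saved as requirements.txt), it can be installed with:
#   pip install -r requirements.txt
# The --extra-index-url entry above lets pip resolve the CPU wheels for llama-cpp-python.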