license: cc-by-nc-4.0
JAX weights converted from the PyTorch checkpoint at `facebook/galactica-1.3b`.
```python
(env) ubuntu@vm:~$ JAX_PLATFORM_NAME=cpu python3
>>> import jax
>>> print(jax.devices())
[CpuDevice(id=0)]  # Ensure that model weights are loaded into CPU RAM, not accelerator memory.
>>> from transformers import FlaxOPTForCausalLM
>>> model = FlaxOPTForCausalLM.from_pretrained("facebook/galactica-1.3b", from_pt=True)  # from_pt=True converts the PyTorch weights to Flax on load
>>> hf_model_repo = "your-username/galactica-1.3b-jax"  # placeholder; substitute your target Hub repo id
>>> model.push_to_hub(hf_model_repo)
```
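Once pushed, the converted weights load natively in Flax, with no `from_pt` step. A minimal generation sketch, assuming the placeholder repo id from above and reusing the original checkpoint's tokenizer (which `push_to_hub` on the model alone does not upload):

```python
from transformers import AutoTokenizer, FlaxOPTForCausalLM

# Placeholder repo id; substitute the repo you pushed to above.
hf_model_repo = "your-username/galactica-1.3b-jax"

# The tokenizer was not pushed above, so load it from the original checkpoint.
tokenizer = AutoTokenizer.from_pretrained("facebook/galactica-1.3b")
model = FlaxOPTForCausalLM.from_pretrained(hf_model_repo)  # native Flax weights

inputs = tokenizer("The Transformer architecture", return_tensors="np")
outputs = model.generate(inputs["input_ids"], max_length=32)
print(tokenizer.decode(outputs.sequences[0], skip_special_tokens=True))
```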
## Citation and Attribution
The citation from the original repository is reproduced below, as required by the CC BY-NC 4.0 license.
```bibtex
@inproceedings{GALACTICA,
    title={GALACTICA: A Large Language Model for Science},
    author={Ross Taylor and Marcin Kardas and Guillem Cucurull and Thomas Scialom and Anthony Hartshorn and Elvis Saravia and Andrew Poulton and Viktor Kerkez and Robert Stojnic},
    year={2022}
}
```
> Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC).