Tags: Fill-Mask · Transformers · PyTorch · Joblib · DNA · biology · genomics · custom_code · Inference Endpoints
hdallatorre committed
Commit 9d0067a
1 parent: a973934

Update README.md

Files changed (1): README.md (+1, -1)
README.md CHANGED
@@ -45,7 +45,7 @@ model = AutoModelForMaskedLM.from_pretrained("InstaDeepAI/nucleotide-transformer
  # Choose the length to which the input sequences are padded. By default, the
  # model max length is chosen, but feel free to decrease it as the time taken to
  # obtain the embeddings increases significantly with it.
- max_length = 15
+ max_length = tokenizer.model_max_length
 
  # Create a dummy dna sequence and tokenize it
  sequences = ["ATTCCGATTCCGATTCCG", "ATTTCTCTCTCTCTCTGAGATCGATCGATCGAT"]
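For context, the sketch below shows how the padded length chosen on the changed line feeds into tokenization and embedding extraction. This is not the repository's verbatim README snippet: it assumes the standard transformers API, and the checkpoint id is an illustrative stand-in because the hunk header truncates the real name.

```python
# Minimal sketch, assuming the standard transformers tokenizer/model API.
# The checkpoint id is illustrative only; substitute the checkpoint this
# commit actually belongs to.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

checkpoint = "InstaDeepAI/nucleotide-transformer-500m-human-ref"  # illustrative
tokenizer = AutoTokenizer.from_pretrained(checkpoint, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(checkpoint, trust_remote_code=True)

# Pad to the model's maximum length by default; a smaller value speeds up
# embedding extraction at the cost of truncating longer sequences.
max_length = tokenizer.model_max_length

# Dummy DNA sequences, tokenized and padded to max_length.
sequences = ["ATTCCGATTCCGATTCCG", "ATTTCTCTCTCTCTCTGAGATCGATCGATCGAT"]
tokens = tokenizer(
    sequences,
    return_tensors="pt",
    padding="max_length",
    max_length=max_length,
    truncation=True,
)

# Forward pass with hidden states exposed; the last hidden state serves as
# per-token embeddings. Padding positions are masked out before mean pooling.
with torch.no_grad():
    outputs = model(
        tokens["input_ids"],
        attention_mask=tokens["attention_mask"],
        output_hidden_states=True,
    )
embeddings = outputs.hidden_states[-1]
mask = tokens["attention_mask"].unsqueeze(-1)
mean_embeddings = (embeddings * mask).sum(1) / mask.sum(1)
print(mean_embeddings.shape)  # (batch_size, hidden_size)
```

Defaulting to `tokenizer.model_max_length` avoids silently truncating inputs longer than a hard-coded value such as 15 tokens, at the cost of longer embedding computation, which is exactly the trade-off the surrounding comment describes.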