pythia-31m (fp32)

This is EleutherAI/pythia-31m, but saved explicitly in fp32 (see the safetensors tensor type below). It is smaller than the other 'official' checkpoints included in the Pythia suite.
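
A minimal usage sketch (not part of the original card): load the checkpoint with `transformers` and confirm the weights come back in fp32.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ethzanalytics/pythia-31m"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)

# Both the config and the live parameters should report float32.
print(model.config.torch_dtype)            # float32
print(next(model.parameters()).dtype)      # torch.float32

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```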

config/info

```json
{
  "_name_or_path": "EleutherAI/pythia-31m",
  "architectures": [
    "GPTNeoXForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 0,
  "classifier_dropout": 0.1,
  "eos_token_id": 0,
  "hidden_act": "gelu",
  "hidden_dropout": 0.0,
  "hidden_size": 256,
  "initializer_range": 0.02,
  "intermediate_size": 1024,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 2048,
  "model_type": "gpt_neox",
  "num_attention_heads": 8,
  "num_hidden_layers": 6,
  "rope_scaling": null,
  "rotary_emb_base": 10000,
  "rotary_pct": 0.25,
  "tie_word_embeddings": false,
  "torch_dtype": "float32",
  "transformers_version": "4.33.1",
  "use_cache": true,
  "use_parallel_residual": true,
  "vocab_size": 50304
}
```
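
As a sanity check, the 30.5M figure reported for the safetensors weights can be reproduced from the config above. A back-of-the-envelope tally, assuming the standard GPT-NeoX layout (untied input/output embeddings per `tie_word_embeddings: false`, biased linear layers, two LayerNorms per block plus a final one):

```python
# Rough parameter count for pythia-31m from its config values.
hidden, layers, ffn, vocab = 256, 6, 1024, 50304

embeddings = 2 * vocab * hidden               # embed_in + embed_out (untied)
attn = hidden * 3 * hidden + 3 * hidden       # fused QKV projection (+ bias)
attn += hidden * hidden + hidden              # attention output proj (+ bias)
mlp = 2 * (hidden * ffn) + ffn + hidden       # h->4h and 4h->h (+ biases)
norms = 2 * (2 * hidden)                      # two LayerNorms per block
per_layer = attn + mlp + norms

total = embeddings + layers * per_layer + 2 * hidden  # + final LayerNorm
print(f"{total:,}")  # 30,494,720 ~= 30.5M
```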
Safetensors: 30.5M params, tensor type F32
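
To verify the stored dtype directly (the point of this re-upload), one can inspect the safetensors file itself. A sketch, assuming the weights live in a single `model.safetensors` shard:

```python
from huggingface_hub import hf_hub_download
from safetensors import safe_open

# Download the shard and check the dtype of every stored tensor.
path = hf_hub_download("ethzanalytics/pythia-31m", "model.safetensors")
with safe_open(path, framework="pt") as f:
    dtypes = {f.get_tensor(name).dtype for name in f.keys()}

print(dtypes)  # expected: {torch.float32}
```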
