
My Discord server

https://discord.gg/yaTfFF6Ut2

I am planning to fine-tune a 64K-context instruction model. Please help me with the planning.

If you're interested, you can support me on Ko-fi: https://ko-fi.com/ogodwin10

Larger models

Yes, once I have the resources I will start building larger models and fine-tuning them. If you would like other models or services, leave me a message on Discord and we can talk.

Breeze-13B-32k-Base-v1_0

Breeze-13B-32k-Base-v1_0 is a self-merge of MediaTek-Research/Breeze-7B-32k-Base-v1_0, created with mergekit using the passthrough method:

🧩 Configuration

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 8]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [0, 8]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
- sources:
  - layer_range: [4, 12]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [4, 12]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
- sources:
  - layer_range: [8, 16]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [8, 16]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
- sources:
  - layer_range: [12, 20]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [12, 20]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
- sources:
  - layer_range: [16, 24]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [16, 24]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
- sources:
  - layer_range: [20, 28]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [20, 28]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
- sources:
  - layer_range: [24, 32]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
  - layer_range: [24, 32]
    model: MediaTek-Research/Breeze-7B-32k-Base-v1_0
```
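As a sanity check on the configuration above, the slice layout can be expanded in a few lines of Python: each 8-layer window is offset by 4 layers from the previous one, so the merged model stacks 56 transformer layers where the 7B base has 32 (assuming each `sources` block contributes one window; the two identical entries per slice point at the same weights). The per-layer parameter arithmetic below is a rough back-of-the-envelope estimate, not an official figure.

```python
# Sketch: expand the slice layout from the mergekit config above.
# Each tuple is one `layer_range` window of Breeze-7B-32k-Base-v1_0.
slices = [(0, 8), (4, 12), (8, 16), (12, 20), (16, 24), (20, 28), (24, 32)]

# Total transformer layers in the merged model.
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 56, vs. 32 in the base model

# Rough size estimate (assumed numbers): if the ~7.49B base has 32
# transformer layers plus roughly 0.5B embedding/head parameters, each
# layer holds about (7.49 - 0.5) / 32 ≈ 0.22B parameters.
est_params = 0.5 + total_layers * (7.49 - 0.5) / 32
print(round(est_params, 1))  # ≈ 12.7, consistent with the reported 12.7B
```

The overlapping windows (stride 4, width 8) mean every base layer except the first and last four appears twice in the stack, which is how a 7B model grows to roughly 12.7B parameters without any new weights being trained.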
Format: Safetensors
Model size: 12.7B params
Tensor type: BF16
