# merge

Gunulhona/Gemma-3-27B-Text-Only is a merge of pre-trained language models created with [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the Arcee Fusion merge method, with [google/medgemma-27b-text-it](https://huggingface.co/google/medgemma-27b-text-it) as the base.
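Mergekit's actual Arcee Fusion implementation is not reproduced here. As an illustrative sketch only, the general idea of importance-gated fusion — apply only the most salient parameter changes from the second model onto the base — can be shown with small arrays. The function name, the `|delta|` importance score, and the quantile threshold below are all assumptions for the sketch, not mergekit's code:

```python
# Toy sketch of importance-gated parameter fusion (NOT mergekit's
# Arcee Fusion code; the scoring rule here is an assumption).
import numpy as np

def fusion_merge(base, other, quantile=0.5):
    """Merge `other` into `base`, keeping only the most salient changes.

    Importance is scored as |delta| per parameter (an assumption);
    only deltas at or above the chosen quantile are applied.
    """
    delta = other - base
    importance = np.abs(delta)
    threshold = np.quantile(importance, quantile)
    mask = importance >= threshold
    # Where the change is salient, take the fine-tuned value; else keep base.
    return np.where(mask, other, base)

base = np.array([0.0, 1.0, 2.0, 3.0])
other = np.array([0.1, 1.0, 2.5, 2.0])
merged = fusion_merge(base, other, quantile=0.5)
# Only the two largest deltas (indices 2 and 3) are applied.
```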

### Models Merged

The following models were included in the merge:

* /content/gemma-3-proxy

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: google/medgemma-27b-text-it
dtype: bfloat16
merge_method: arcee_fusion
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 62]
        model: google/medgemma-27b-text-it
      - layer_range: [0, 62]
        model: /content/gemma-3-proxy
```
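Before handing a configuration like this to mergekit, it can be worth sanity-checking that both source models cover the same layer span, since the fusion pairs layers one-to-one. A minimal check, assuming PyYAML is installed (the `config_text` string simply repeats the configuration above):

```python
# Sanity-check sketch for the merge config above (assumes PyYAML).
import yaml

config_text = """
base_model: google/medgemma-27b-text-it
dtype: bfloat16
merge_method: arcee_fusion
modules:
  default:
    slices:
    - sources:
      - layer_range: [0, 62]
        model: google/medgemma-27b-text-it
      - layer_range: [0, 62]
        model: /content/gemma-3-proxy
"""

config = yaml.safe_load(config_text)
sources = config["modules"]["default"]["slices"][0]["sources"]
# A one-to-one fusion needs identical layer spans on both sides.
assert all(s["layer_range"] == [0, 62] for s in sources)
```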
