A model fine-tuned for function-calling tasks.

Training configuration:

```python
config = {
    "rank": 8,
    "alpha": 16,
    "learning_rate": 2e-5,
    "target_modules": ["mlps"]
}
```
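These hyperparameters (rank, alpha, target modules) look like a LoRA setup. Below is a minimal sketch of how they might map to a PEFT `LoraConfig`; the concrete target module names and the training loop are assumptions, not taken from this card ("mlps" is interpreted here as Qwen2.5's MLP projection layers).

```python
# Sketch only: mapping the listed hyperparameters onto a PEFT LoraConfig.
# Module names and training setup are assumptions, not confirmed by this card.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "Qwen/Qwen2.5-1.5B"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id)

lora_config = LoraConfig(
    r=8,                    # "rank" from the card
    lora_alpha=16,          # "alpha" from the card
    # Assumed expansion of "mlps" into Qwen2.5 MLP projection layer names.
    target_modules=["gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# The learning rate (2e-5) would be passed to the optimizer or trainer,
# e.g. transformers.TrainingArguments(learning_rate=2e-5, ...).
```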
Base model: Qwen/Qwen2.5-1.5B
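
A minimal inference sketch with `transformers` is shown below. The exact prompt format the fine-tune expects for tool or function definitions is not documented on this card, so the prompt here is only illustrative.

```python
# Sketch: loading beyoru/SR2_funcCalling for generation.
# The function-calling prompt format is an assumption for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "beyoru/SR2_funcCalling"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = (
    "You can call the function get_weather(city: str).\n"
    "User: What's the weather in Hanoi?"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```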