phi3.5-phunction-calling Model Card

Model Overview

Model Name: phi3.5-phunction-calling

Description: This model is a fine-tuned version of phi3.5, specialized for function calling. Given function signatures supplied in the prompt, it is trained to select the appropriate function and emit a structured call with the correct arguments.

Intended Use

Primary Use Case: This model is intended for use in applications where function calling is a critical component, such as automated assistants, code generation, and API interaction.

Limitations: Although the model handles typical function-calling prompts well, it may still produce incorrect or malformed calls when requests are complex or ambiguous.
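
Usage

The example below is a minimal sketch of running the model with Unsloth: it loads the checkpoint, applies the phi-3 chat template, and generates a response to a function-calling prompt. The repository path and loading arguments (sequence length, 4-bit loading) are illustrative assumptions; adjust them to your setup.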

from unsloth import FastLanguageModel
from unsloth.chat_templates import get_chat_template

# Load the fine-tuned checkpoint and its tokenizer.
# The repository path below is illustrative; point it at the checkpoint you are using.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "ayan-sh003/phi3.5-phunction-calling",
    max_seq_length = 2048,
    load_in_4bit = True,
)

tokenizer = get_chat_template(
    tokenizer,
    chat_template = "phi-3", # Supports zephyr, chatml, mistral, llama, alpaca, vicuna, vicuna_old, unsloth
    mapping = {"role" : "from", "content" : "value", "user" : "human", "assistant" : "gpt"}, # ShareGPT style
)

FastLanguageModel.for_inference(model) # Enable native 2x faster inference

messages = [
    {"from": "system", "value": "You are a function calling AI model. You are provided with function signatures within <tools> </tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.\n<tools>\n[{'type': 'function', 'function': {'name': 'search_recipes', 'description': 'Searches for recipes based on given criteria.', 'parameters': {'type': 'object', 'properties': {'cuisine': {'type': 'string', 'description': 'The type of cuisine to search for.'}, 'dietary_restriction': {'type': 'string', 'description': 'Any dietary restrictions to consider.', 'enum': ['vegetarian', 'vegan', 'gluten-free', 'none']}}, 'required': ['cuisine']}}}, {'type': 'function', 'function': {'name': 'get_recipe_details', 'description': 'Retrieves detailed information about a specific recipe.', 'parameters': {'type': 'object', 'properties': {'recipe_id': {'type': 'string', 'description': 'The unique identifier for the recipe.'}}, 'required': ['recipe_id']}}}, {'type': 'function', 'function': {'name': 'calculate_nutrition', 'description': 'Calculates nutritional information for a given recipe.', 'parameters': {'type': 'object', 'properties': {'recipe_id': {'type': 'string', 'description': 'The unique identifier for the recipe.'}, 'serving_size': {'type': 'integer', 'description': 'The number of servings to calculate nutrition for.', 'default': 1}}, 'required': ['recipe_id']}}}]\n</tools>\nFor each function call return a json object with function name and arguments within <tool_call> </tool_call> tags with the following schema:\n<tool_call>\n{'arguments': <args-dict>, 'name': <function-name>}\n</tool_call>\n"},
    {"from": "human", "value": "I'm planning a dinner party and I'm looking for some Italian recipes to try. Can you help me find some vegetarian Italian dishes? Once we have a list, I'd like to get more details about the first recipe in the search results. Finally, I want to calculate the nutritional information for that recipe, assuming I'm cooking for 4 people. Can you please perform these tasks for me?"},
]

inputs = tokenizer.apply_chat_template(
    messages,
    tokenize = True,
    add_generation_prompt = True, # Must add for generation
    return_tensors = "pt",
).to("cuda")

outputs = model.generate(input_ids = inputs, max_new_tokens = 256, use_cache = True)
print(tokenizer.batch_decode(outputs)[0]) # Contains the prompt plus any generated <tool_call> blocks
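
The model responds with one or more <tool_call> blocks following the schema given in the system prompt. One possible way to extract and execute those calls is sketched below; the regex, the fallback to ast.literal_eval (for Python-dict-style output with single quotes), and the available_functions registry are illustrative assumptions rather than part of the model or Unsloth.

import ast
import json
import re

def extract_tool_calls(generated_text):
    """Pull every <tool_call> ... </tool_call> block out of the model output."""
    calls = []
    for block in re.findall(r"<tool_call>\s*(.*?)\s*</tool_call>", generated_text, re.DOTALL):
        try:
            calls.append(json.loads(block))        # strict JSON output
        except json.JSONDecodeError:
            calls.append(ast.literal_eval(block))  # single-quoted dict-style output
    return calls

# Hypothetical registry mapping tool names to local implementations.
available_functions = {
    "search_recipes": lambda cuisine, dietary_restriction="none": {"recipes": []},
}

decoded = tokenizer.batch_decode(outputs)[0]
for call in extract_tool_calls(decoded):
    fn = available_functions.get(call["name"])
    if fn is not None:
        print(call["name"], "->", fn(**call["arguments"]))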

Model Details

Format: GGUF

Model size: 3.82B params

Architecture: llama
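
Because this repository also ships GGUF quantizations, the model can be run through llama.cpp-based tooling such as llama-cpp-python. The sketch below is an assumption-heavy illustration: the quantization filename pattern is hypothetical (use one of the .gguf files actually published in the repository), and the prompt handling may need adjusting to match the phi-3 chat template.

from llama_cpp import Llama

# The filename glob is hypothetical; substitute a .gguf file that exists in the repo.
llm = Llama.from_pretrained(
    repo_id = "ayan-sh003/phi3.5-phunction-calling-GGUF",
    filename = "*Q4_K_M.gguf",
    n_ctx = 2048,
)

response = llm.create_chat_completion(
    messages = [
        # Reuse the full function-calling system prompt shown in the Usage example.
        {"role": "system", "content": "You are a function calling AI model. ..."},
        {"role": "user", "content": "Find vegetarian Italian recipes for a dinner party."},
    ],
    max_tokens = 256,
)
print(response["choices"][0]["message"]["content"])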
