Fine-tuned phi-4 on GenAIScript

This model is a fine-tuned version of microsoft/phi-4 on the GenAIScript training dataset. The base phi-4 model has no prior knowledge of the GenAIScript scripting language, since GenAIScript was not part of its pretraining data. This fine-tuned version has been trained specifically to understand and generate valid GenAIScript code.

Model Description

  • Base model: microsoft/phi-4
  • Fine-tuned on: igor273/genaiscript_training_dataset
  • Task: Code generation and completion for GenAIScript
  • Model size: 15B parameters (BF16, Safetensors)
  • Quantized: Yes; optimized for local inference on resource-constrained machines (see the loading sketch below)
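
A minimal loading sketch with Hugging Face Transformers is shown below. The repository id `igor273/phi-4-genaiscript` comes from this card; the dtype and device settings are assumptions to adapt to your hardware, and a quantized variant may be preferable on small machines.

```python
# Minimal loading sketch (not an official snippet from this card).
# Assumes enough memory for ~15B BF16 parameters; pick a quantized
# variant or a smaller dtype if that does not fit your machine.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "igor273/phi-4-genaiscript"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's BF16 weights where supported
    device_map="auto",    # requires `accelerate`; spreads layers across GPU/CPU
)
```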

Dataset

The dataset was created from official Microsoft GenAIScript documentation and real-world code snippets (a quick inspection sketch follows the list below). It includes:

  • Script generation examples
  • Function usage and syntax patterns
  • Control structures and logic flows
  • Valid use cases and best practices
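
For a quick look at the training examples, the sketch below loads the dataset with the Hugging Face `datasets` library. The column names are not documented on this card, so the code prints whatever schema the dataset actually exposes rather than assuming one.

```python
# Minimal inspection sketch; the dataset id comes from this card,
# everything else is a generic `datasets` workflow.
from datasets import load_dataset

ds = load_dataset("igor273/genaiscript_training_dataset")
print(ds)                          # available splits and their sizes

split = next(iter(ds.values()))    # take the first split, whatever it is named
print(split.column_names)          # discover the actual field names
print(split[0])                    # inspect one raw training example
```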

Capabilities

  • Understands GenAIScript syntax and semantics
  • Can generate end-to-end scripts from natural language prompts (see the sketch below)
  • Can assist in learning and exploring GenAIScript capabilities
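
As an illustrative (not authoritative) example of end-to-end generation, the sketch below reuses the `model` and `tokenizer` from the loading example above. The prompt text and generation settings are placeholders, and it assumes the tokenizer ships the phi-4 chat template.

```python
# Illustrative generation sketch; prompt and settings are placeholders.
request = (
    "Write a GenAIScript that reads every Markdown file in the workspace, "
    "summarizes each one, and writes the summaries to SUMMARY.md."
)
messages = [{"role": "user", "content": request}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=512, do_sample=False)
script = tokenizer.decode(
    output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True
)
print(script)  # the generated GenAIScript source
```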

Limitations

  • May require updates if the GenAIScript specification evolves
  • Quantization may reduce generation precision in some edge cases

License

The base model phi-4 and the dataset are subject to their respective licenses. This fine-tuned version inherits those terms.


Maintained by @igor273
