Azure OpenAI follows a fine-tuning process similar to OpenAI's, with some Azure-specific configuration. The examples below show how to use Portkey with Azure OpenAI for fine-tuning.

Upload a file

from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY", # Replace with your Portkey API key
    virtual_key="VIRTUAL_KEY" # Add your provider's virtual key for Azure OpenAI
)

# Upload a file for fine-tuning
file = portkey.files.create(
    file="dataset.jsonl",
    purpose="fine-tune"
)

print(file)
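
The file ID returned here is what you pass as training_file in the next step. As a minimal sketch, assuming the Portkey SDK mirrors the OpenAI files interface (files.retrieve and the usual status values, which can differ slightly on Azure), you can poll until the upload has been processed:

import time

# Poll until the provider has finished validating the uploaded file.
# Status names follow the OpenAI files API ("processed", "error"); Azure may
# report intermediate states such as "pending" or "running".
while True:
    uploaded = portkey.files.retrieve(file.id)
    if uploaded.status in ("processed", "error"):
        break
    time.sleep(5)

print(uploaded.status) # Use file.id as training_file once the status is "processed"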

Create a fine-tuning job

from portkey_ai import Portkey

# Initialize the Portkey client
portkey = Portkey(
    api_key="PORTKEY_API_KEY", # Replace with your Portkey API key
    virtual_key="VIRTUAL_KEY" # Add your provider's virtual key for Azure OpenAI
)

# Create a fine-tuning job
fine_tune_job = portkey.fine_tuning.jobs.create(
    model="gpt-35-turbo", # Base model to fine-tune
    training_file="file_id", # ID of the uploaded training file
    validation_file="file_id", # Optional: ID of the uploaded validation file
    suffix="finetune_name", # Custom suffix for the fine-tuned model name
    hyperparameters={
        "n_epochs": 1
    }
)

print(fine_tune_job)

For more detailed examples and other fine-tuning operations (listing jobs, retrieving job details, canceling jobs, and getting job events), please refer to the OpenAI fine-tuning documentation.
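
As a quick reference, here is a minimal sketch of those job-management calls, assuming the Portkey SDK mirrors the OpenAI fine-tuning jobs interface; the "ftjob_id" placeholder stands in for the ID returned when the job was created:

# List recent fine-tuning jobs
jobs = portkey.fine_tuning.jobs.list(limit=10)

# Retrieve the details of a single job
job = portkey.fine_tuning.jobs.retrieve("ftjob_id")

# List the job's events (training progress, errors, completion)
events = portkey.fine_tuning.jobs.list_events(fine_tuning_job_id="ftjob_id", limit=10)

# Cancel a running job
cancelled = portkey.fine_tuning.jobs.cancel("ftjob_id")

print(jobs, job, events, cancelled)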

Azure-specific details of the fine-tuning API are covered in the Azure OpenAI API documentation.