---
title: Interactive LLM labeling with GPT
type: guide
tier: all
order: 5
hide_menu: true
hide_frontmatter_title: true
meta_title: Interactive LLM labeling with OpenAI, Azure, or Ollama
meta_description: Label Studio tutorial for interactive LLM labeling with OpenAI, Azure, or Ollama
categories:
- Generative AI
- Large Language Model
- OpenAI
- Azure
- Ollama
- ChatGPT
image: "/guide/ml_tutorials/llm-interactive.png"
---
# Interactive LLM labeling
This example server connects Label Studio to the [OpenAI](https://platform.openai.com/), [Ollama](https://ollama.com/), or [Azure OpenAI](https://azure.microsoft.com/en-us/products/ai-services/openai-service) API so that you can interact with GPT chat models (gpt-3.5-turbo, gpt-4, and so on).
The interactive flow lets you perform the following scenarios:
* Autolabel data given an LLM prompt (for example, "Classify this text as sarcastic or not").
* Collect pairs of user prompts and response inputs to fine-tune your own LLM.
* Automate data collection and summarization over image documents.
* Create an RLHF (Reinforcement Learning from Human Feedback) loop to improve the LLM's performance.
* Evaluate the LLM's performance.
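For the fine-tuning scenario, the collected prompt/response pairs eventually need to be serialized into a training file. As a minimal sketch (the helper name and system prompt below are illustrative, not part of the backend), pairs exported from Label Studio can be written in OpenAI's chat fine-tuning JSONL format:

```python
import json


def pairs_to_jsonl(pairs, system_prompt="You are a helpful labeling assistant."):
    """Serialize (prompt, response) pairs into OpenAI chat fine-tuning JSONL lines."""
    lines = []
    for prompt, response in pairs:
        record = {
            "messages": [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": prompt},
                {"role": "assistant", "content": response},
            ]
        }
        lines.append(json.dumps(record))
    return "\n".join(lines)


# Example: two pairs collected from annotators
jsonl = pairs_to_jsonl([
    ("Classify this text as sarcastic or not: 'Great, another Monday.'", "sarcastic"),
    ("Classify this text as sarcastic or not: 'The meeting starts at 3pm.'", "not sarcastic"),
])
```

Each line of the resulting file is one training example, which is the format expected by OpenAI's fine-tuning endpoints for chat models.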
Check the [Generative AI templates](https://labelstud.io/templates/gallery_generative_ai) section for more examples.
## Before you begin
Before you begin, you must install the [Label Studio ML backend](https://github.com/HumanSignal/label-studio-ml-backend?tab=readme-ov-file#quickstart).
This tutorial uses the [`llm_interactive` example](https://github.com/HumanSignal/label-studio-ml-backend/tree/master/label_studio_ml/examples/llm_interactive).
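The example backend reads your LLM credentials from the environment at startup. As a sketch (check the example's `docker-compose.yml` for the exact variable names it expects; `OPENAI_API_KEY` is a common convention), you can export the key before starting the containers:

```shell
# Hypothetical setup: supply your OpenAI credentials before starting the backend.
# Azure and Ollama deployments use their own endpoint/key variables instead.
export OPENAI_API_KEY=<your-key>
docker-compose up
```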
## Quickstart
1. Build and start the Machine Learning backend on `http://localhost:9090`:
```bash
docker-compose up
```
2. Check if it works:
```bash
$ curl http://localhost:9090/health
{"status":"UP"}
```
3. Open a Label Studio project and go to **Settings > Model**. [Connect the model](https://labelstud.io/guide/ml#Connect-the-model-to-Label-Studio), specifying `http://localhost:9090` as the URL.
Ensure the **Interactive preannotations** toggle is enabled and click **Validate and Save**.
4. The project config must be compatible with the ML backend. This backend supports various input data formats, including plain text, hypertext, images, and structured dialogs. To ensure the project config is compatible, follow these rules:
- The project should contain at least one `