OpenAI
Chat Completions API
Call OpenAI models using the Chat Completions API.
Get Your API Key
Go to the API Keys page to create your API key.
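Rather than hard-coding the `your_api_key_here` placeholder used in the examples below, you can read the key from an environment variable. A minimal sketch (the variable name `TOKENOFF_API_KEY` is illustrative, not something the service prescribes):

```python
import os

# Variable name is illustrative; use whatever name matches how you
# export the key in your shell. Falls back to the docs' placeholder.
api_key = os.environ.get("TOKENOFF_API_KEY", "your_api_key_here")
```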
API Base URL
https://tokenoff.com/api
OpenAI models support the /v1/chat/completions endpoint, fully compatible with the OpenAI Chat Completions API.
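Concretely, the endpoint used by every example below is the base URL plus the standard Chat Completions path:

```python
# Base URL from the section above; the OpenAI-compatible path is appended.
BASE_URL = "https://tokenoff.com/api"
CHAT_COMPLETIONS_URL = f"{BASE_URL}/v1/chat/completions"
```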
curl
Blocking Call
curl https://tokenoff.com/api/v1/chat/completions \
-H "Authorization: Bearer your_api_key_here" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-5.4",
"messages": [
{"role": "user", "content": "Hello"}
]
}'
Output:
{"id":"resp_07cc003c4736c70b0069e0c1e41c2c8196952f2f53524e1891","object":"chat.completion","created":1776337380,"model":"gpt-4.1","choices":[{"index":0,"message":{"role":"assistant","content":"Hello! How can I help you today?"},"finish_reason":"stop"}],"usage":{"prompt_tokens":7,"completion_tokens":13,"total_tokens":20}}
Streaming Response
curl https://tokenoff.com/api/v1/chat/completions \
-H "Authorization: Bearer your_api_key_here" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-5.4",
"messages": [
{"role": "user", "content": "Hello"}
],
"stream": true
}'
Output:
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":"!"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":" How"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":" can"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":" I"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":" help"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":" you"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":" today"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{"content":"?"},"finish_reason":null}]}
data: {"id":"resp_01570884e7e6265a0069e0c260ed9c8194bd120e896b171efb","object":"chat.completion.chunk","created":1776337504,"model":"gpt-4.1","choices":[{"index":0,"delta":{},"finish_reason":"stop"}],"usage":{"prompt_tokens":7,"completion_tokens":13,"total_tokens":20}}
data: [DONE]
Python
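Each data: line in the stream above carries one JSON chunk; the reply text is recovered by concatenating choices[0].delta.content across chunks until the [DONE] sentinel. A minimal stand-alone parser sketch (the function name is illustrative):

```python
import json

def join_stream(sse_lines):
    """Concatenate assistant text from Chat Completions SSE 'data:' lines."""
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alive lines between events
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if delta.get("content"):
            parts.append(delta["content"])
    return "".join(parts)
```

The official SDK shown below performs this iteration for you; this sketch is only for consuming the raw SSE stream directly.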
pip install openai
Blocking Call
from openai import OpenAI
client = OpenAI(
api_key="your_api_key_here",
base_url="https://tokenoff.com/api/v1",
)
response = client.chat.completions.create(
model="gpt-5.4",
messages=[
{"role": "user", "content": "Hello"}
],
)
print(response.choices[0].message.content)
Output:
Hello! How can I help?
Streaming Response
from openai import OpenAI
client = OpenAI(
api_key="your_api_key_here",
base_url="https://tokenoff.com/api/v1",
)
response = client.chat.completions.create(
model="gpt-5.4",
messages=[
{"role": "user", "content": "Hello"}
],
stream=True,
)
for event in response:
    print(event)
Output:
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content='', function_call=None, refusal=None, role='assistant', tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content='Hello', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content='!', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content=' How', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content=' can', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content=' I', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content=' help', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content='?', function_call=None, refusal=None, role=None, tool_calls=None), finish_reason=None, index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None)
ChatCompletionChunk(id='resp_02f00b38518533fd0069e0c3f104fc819484590784cd3b0aa7', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role=None, tool_calls=None), finish_reason='stop', index=0, logprobs=None)], created=1776337905, model='gpt-4.1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=CompletionUsage(completion_tokens=11, prompt_tokens=7, total_tokens=18, completion_tokens_details=None, prompt_tokens_details=None))
TypeScript
npm install openai
Blocking Call
import OpenAI from "openai";
const client = new OpenAI({
apiKey: "your_api_key_here",
baseURL: "https://tokenoff.com/api/v1",
});
const response = await client.chat.completions.create({
model: "gpt-5.4",
messages: [
{ role: "user", content: "Hello" },
],
});
console.log(response.choices[0]?.message.content);
Output:
Hello! How can I help you today?
Streaming Response
import OpenAI from "openai";
const client = new OpenAI({
apiKey: "your_api_key_here",
baseURL: "https://tokenoff.com/api/v1",
});
const response = await client.chat.completions.create({
model: "gpt-5.4",
messages: [
{ role: "user", content: "Hello" },
],
stream: true,
});
for await (const event of response) {
  console.log(event);
}
Output:
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: [Object], finish_reason: null } ]
}
{
id: 'resp_09288c4e32022db60069e0c4918cec819699f22a80c5900655',
object: 'chat.completion.chunk',
created: 1776338065,
model: 'gpt-4.1',
choices: [ { index: 0, delta: {}, finish_reason: 'stop' } ],
usage: { prompt_tokens: 7, completion_tokens: 13, total_tokens: 20 }
}
Supported Models
gpt-5.4 (GPT-5.4)
gpt-5.4-mini (GPT-5.4 mini)
gpt-5.4-nano (GPT-5.4 nano)
gpt-5.2 (GPT-5.2)
gpt-5.1 (GPT-5.1)
gpt-5 (GPT-5)
gpt-5-mini (GPT-5 mini)
gpt-5-nano (GPT-5 nano)
gpt-4.1 (GPT-4.1)
gpt-4.1-mini (GPT-4.1 mini)
gpt-4.1-nano (GPT-4.1 nano)
gpt-4o (GPT-4o)
gpt-4o-mini (GPT-4o mini)
o4-mini
o3
o3-mini
o3-pro
gpt-5.3-codex (GPT-5.3-Codex)
gpt-5.3-codex-spark (GPT-5.3-Codex-Spark)
gpt-5.2-codex (GPT-5.2-Codex)
gpt-5.1-codex-max (GPT-5.1-Codex-Max)
gpt-5.1-codex (GPT-5.1-Codex)
gpt-5-codex (GPT-5-Codex)
gpt-5.1-codex-mini (GPT-5.1-Codex-Mini)
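Client code can validate the model parameter against this list before sending a request, failing fast instead of waiting for a server-side error. A small sketch (the helper name is illustrative, and the dict is abbreviated; see the full list above for every supported ID):

```python
# The list above as an ID -> display-name lookup (abbreviated here).
SUPPORTED_MODELS = {
    "gpt-5.4": "GPT-5.4",
    "gpt-5.4-mini": "GPT-5.4 mini",
    "gpt-4.1": "GPT-4.1",
    "gpt-4o": "GPT-4o",
    "o3": "o3",
    "gpt-5-codex": "GPT-5-Codex",
}

def require_supported(model_id: str) -> str:
    """Raise a clear error before an unsupported ID reaches the API."""
    if model_id not in SUPPORTED_MODELS:
        raise ValueError(f"unsupported model: {model_id!r}")
    return model_id
```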
Contact Us
If you encounter any issues while using TokenOff:
Contact us at support@tokenoff.com or through other official channels for technical support
Submit an issue on our GitHub repository