ModelBridge provides a fully OpenAI-compatible API. If you've used the OpenAI SDK before, you already know how to use us — just change the base URL.
Base URL: https://aibridge-api.com/v1
All requests require an API key in the Authorization header:
Authorization: Bearer mb-xxxxxxxxxxxxx
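For illustration, the header can be attached with nothing but the Python standard library (the key below is the placeholder from above, and GET /v1/models is used as the example endpoint):

```python
import json
import urllib.request

BASE_URL = "https://aibridge-api.com/v1"
API_KEY = "mb-xxxxxxxxxxxxx"  # placeholder; substitute your real key

def auth_headers(api_key: str) -> dict:
    """Build the Authorization header required by every endpoint."""
    return {"Authorization": f"Bearer {api_key}"}

def list_models() -> dict:
    """GET /v1/models with the auth header attached (makes a network call)."""
    req = urllib.request.Request(f"{BASE_URL}/models", headers=auth_headers(API_KEY))
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```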
| Method | Endpoint | Description |
|---|---|---|
| GET | /v1/models | List available models |
| POST | /v1/chat/completions | Chat completion (supports streaming) |
| POST | /v1/embeddings | Generate text embeddings |
| GET | /health | Service health check |
| Model ID | Type | Context Window | Best For |
|---|---|---|---|
| deepseek-chat | Chat | 64K | General-purpose conversation, coding |
| deepseek-reasoner | Reasoning | 64K | Complex reasoning, math, logic |
| qwen-max | Chat | 32K | Multilingual tasks |
| qwen-plus | Chat | 131K | Cost-effective general usage, long context |
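As a sketch, the table above can be encoded as a lookup for choosing a model by context budget (window sizes are approximated here as round thousands; "K" may actually denote 1,024-token units):

```python
# Context windows from the model table, approximated as round thousands.
CONTEXT_WINDOWS = {
    "deepseek-chat": 64_000,
    "deepseek-reasoner": 64_000,
    "qwen-max": 32_000,
    "qwen-plus": 131_000,
}

def models_for_budget(tokens_needed: int) -> list[str]:
    """Return model IDs whose context window fits the requested token count."""
    return [m for m, w in CONTEXT_WINDOWS.items() if w >= tokens_needed]
```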
The primary endpoint for generating text responses. Supports both regular and streaming (SSE) modes.
```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://aibridge-api.com/v1",
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
)
print(response.choices[0].message.content)
```
Set stream=True to receive tokens as they are generated:
```python
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Tell me a story."}],
    stream=True,
)
for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
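The /v1/embeddings endpoint follows the OpenAI embeddings API, so the SDK's client.embeddings.create works the same way against this base URL. A raw-request sketch that makes the payload shape explicit — the model ID below is a placeholder, since the table above lists no embedding model:

```python
import json
import urllib.request

BASE_URL = "https://aibridge-api.com/v1"

def build_embeddings_payload(texts: list[str], model: str) -> dict:
    """OpenAI-compatible request body for POST /v1/embeddings."""
    return {"model": model, "input": texts}

def embed(texts: list[str], api_key: str,
          model: str = "your-embedding-model") -> list[list[float]]:
    """POST /v1/embeddings and return vectors in input order (network call).
    'your-embedding-model' is a placeholder; list real IDs via GET /v1/models."""
    req = urllib.request.Request(
        f"{BASE_URL}/embeddings",
        data=json.dumps(build_embeddings_payload(texts, model)).encode(),
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Sort by index defensively so vectors line up with the input texts.
    return [item["embedding"] for item in sorted(data["data"], key=lambda d: d["index"])]
```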
| Status Code | Error | Cause | Solution |
|---|---|---|---|
| 401 | Unauthorized | Invalid or missing API key | Check your Authorization header |
| 404 | Not Found | Invalid endpoint or model | Verify the URL and model name |
| 429 | Rate Limited | Too many requests | Reduce frequency; upgrade plan if needed |
| 500 | Internal Error | Upstream service issue | Retry after a few seconds |
| 502 | Bad Gateway | Upstream AI provider is down or unreachable | Retry after a short delay |
| 504 | Gateway Timeout | Request took too long | Try with shorter prompts |
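The 429 and 5xx rows are all transient, so a client can retry them automatically. A minimal sketch with capped exponential backoff and jitter — the request callable and its (status, body) return shape are assumptions for illustration, not part of the API:

```python
import random
import time

# Transient status codes from the error table above.
RETRYABLE = {429, 500, 502, 504}

def backoff_delays(retries: int = 4, base: float = 0.5, cap: float = 8.0):
    """Yield capped exponential delays: 0.5s, 1s, 2s, 4s by default."""
    for attempt in range(retries):
        yield min(cap, base * 2 ** attempt)

def call_with_retries(do_request, retries: int = 4, base: float = 0.5):
    """Retry do_request() -> (status, body) while the status is retryable."""
    status, body = do_request()
    for delay in backoff_delays(retries, base):
        if status not in RETRYABLE:
            return status, body
        time.sleep(delay + random.uniform(0, delay / 2))  # jitter avoids thundering herd
        status, body = do_request()
    return status, body
```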