Quickstart
Get up and running with LLMrouter.eu in less than 5 minutes.
LLMrouter.eu is a secure gateway for managing Large Language Models (LLMs) from the EU, designed for organizations that require data privacy, cost optimization, and EU compliance. It helps avoid vendor lock-in, ensures data protection, and enables integration of the latest AI technologies into business workflows. LLMrouter is ideal for scalable, compliant AI solutions in regulated environments.
Get started with just a few lines of code, or use your preferred SDK.
Using the LLMrouter.eu API directly
You can use the LLMrouter.eu API directly with any HTTP client, as its endpoints are fully OpenAI API compatible. Simply point your requests to https://proxy.llmrouter.eu/v1 and use your API key for authentication.
Replace YOUR_API_KEY with your actual LLMrouter.eu API key.
- Python

```python
import requests

url = "https://proxy.llmrouter.eu/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}
data = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user", "content": "Hello, LLMrouter!"}
    ]
}

response = requests.post(url, headers=headers, json=data)
print(response.json())
```

- Javascript

```javascript
const response = await fetch('https://proxy.llmrouter.eu/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'user', content: 'Hello, LLMrouter!' }
    ]
  })
});

const result = await response.json();
console.log(result);
```

- TypeScript (fetch)

```typescript
fetch('https://proxy.llmrouter.eu/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'user', content: 'Hello, LLMrouter!' }
    ]
  })
})
  .then(response => response.json())
  .then(obj => console.log(obj.choices[0].message.content));
```

- Curl

```shell
curl --location 'https://proxy.llmrouter.eu/v1/chat/completions' \
  --header "Authorization: Bearer YOUR_API_KEY" \
  --header "Content-Type: application/json" \
  --data '{
    "model": "gpt-3.5-turbo",
    "messages": [
      {
        "role": "user",
        "content": "Hello, LLMrouter!"
      }
    ]
  }'
```
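Whichever client you use, the response follows the OpenAI chat-completions schema, so the assistant's reply lives at `choices[0].message.content`. A small helper can make that explicit; this is a sketch, and the `sample_response` dict below is an illustrative shape, not real server output:

```python
def extract_reply(response_json: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style chat completion."""
    choices = response_json.get("choices", [])
    if not choices:
        raise ValueError(f"No choices in response: {response_json}")
    return choices[0]["message"]["content"]

# Illustrative response shape (not actual server output):
sample_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! How can I help?"}}
    ]
}
print(extract_reply(sample_response))  # -> Hello! How can I help?
```

Raising on an empty `choices` list surfaces error payloads early instead of failing with an opaque `IndexError`.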
Using the OpenAI SDK
- Python

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://proxy.llmrouter.eu/v1",
    api_key="YOUR_API_KEY",
)

completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hello, LLMrouter!"}
    ]
)

print(completion.choices[0].message.content)
```

- TypeScript

```typescript
import OpenAI from 'openai';

const openai = new OpenAI({
  baseURL: 'https://proxy.llmrouter.eu/v1',
  apiKey: 'YOUR_API_KEY'
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'user', content: 'Hello, LLMrouter!' }
    ]
  });

  console.log(completion.choices[0].message);
}

main();
```
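In real projects you will want to keep the key out of your source code. A minimal sketch of reading it from an environment variable before constructing a client; the variable name `LLMROUTER_API_KEY` is an assumption for illustration, not an official convention:

```python
import os

def get_api_key() -> str:
    # Read the key from an environment variable (name assumed for illustration)
    # so it never ends up committed to source control.
    key = os.environ.get("LLMROUTER_API_KEY")
    if not key:
        raise RuntimeError("Set LLMROUTER_API_KEY before running the examples.")
    return key

# Then pass it to the client instead of a hardcoded string, e.g.:
# client = OpenAI(base_url="https://proxy.llmrouter.eu/v1", api_key=get_api_key())
```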
Using third-party SDKs
- VSCode Copilot: how to integrate LLMrouter.eu with VSCode Copilot