Official SDK Integration
The sections below collect the "one-line config" integration for common official SDKs / frameworks. All examples assume the environment variable TTT_KEY=sk-xxx is already set.
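Before wiring up an SDK, a raw HTTP request is a quick sanity check that the key and base URL work. A minimal sketch using only the Python standard library (the path follows the OpenAI-compatible /v1/chat/completions convention used by the snippets below; the live call is left commented out):

```python
import json
import os
import urllib.request

# Build a raw chat-completions request against the OpenAI-compatible endpoint.
url = "https://tttoken.xyz/v1/chat/completions"
payload = {"model": "gpt-4o", "messages": [{"role": "user", "content": "hi"}]}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {os.environ.get('TTT_KEY', 'sk-xxx')}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send it; commented out to avoid a live call.
```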
Python
openai
```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["TTT_KEY"],
    base_url="https://tttoken.xyz/v1",
)
resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "hi"}],
)
```
anthropic
```python
import os
from anthropic import Anthropic

client = Anthropic(
    api_key=os.environ["TTT_KEY"],
    base_url="https://tttoken.xyz",
)
msg = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=512,
    messages=[{"role": "user", "content": "hi"}],
)
```
google-genai
```python
import os
from google import genai

client = genai.Client(
    api_key=os.environ["TTT_KEY"],
    http_options={"base_url": "https://tttoken.xyz"},
)
r = client.models.generate_content(
    model="gemini-2.5-pro",
    contents="hi",
)
```
LangChain
```python
import os
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o",
    api_key=os.environ["TTT_KEY"],
    base_url="https://tttoken.xyz/v1",
)

# Claude variant:
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-sonnet-4-5",
    api_key=os.environ["TTT_KEY"],
    base_url="https://tttoken.xyz",
)
```
LlamaIndex
```python
import os
from llama_index.llms.openai import OpenAI as LI_OpenAI

llm = LI_OpenAI(
    model="gpt-4o",
    api_key=os.environ["TTT_KEY"],
    api_base="https://tttoken.xyz/v1",
)
```
Node.js / TypeScript
openai
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.TTT_KEY,
  baseURL: "https://tttoken.xyz/v1",
});
```
@anthropic-ai/sdk
```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  apiKey: process.env.TTT_KEY,
  baseURL: "https://tttoken.xyz",
});
```
@google/genai
```typescript
import { GoogleGenAI } from "@google/genai";

const ai = new GoogleGenAI({
  apiKey: process.env.TTT_KEY,
  httpOptions: { baseUrl: "https://tttoken.xyz" },
});
```
Vercel AI SDK
```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const openai = createOpenAI({
  apiKey: process.env.TTT_KEY,
  baseURL: "https://tttoken.xyz/v1",
});
const { text } = await generateText({
  model: openai("gpt-4o"),
  prompt: "hi",
});
```
Go
sashabaranov/go-openai
```go
cfg := openai.DefaultConfig(os.Getenv("TTT_KEY"))
cfg.BaseURL = "https://tttoken.xyz/v1"
cli := openai.NewClientWithConfig(cfg)
```
Rust
```rust
use async_openai::{Client, config::OpenAIConfig};

let cfg = OpenAIConfig::new()
    .with_api_key(std::env::var("TTT_KEY").unwrap())
    .with_api_base("https://tttoken.xyz/v1");
let client = Client::with_config(cfg);
```
Java
```java
OpenAiService service = new OpenAiService(
    System.getenv("TTT_KEY"),
    Duration.ofSeconds(60),
    new OkHttpClient().newBuilder().build(),
    new Retrofit.Builder()
        .baseUrl("https://tttoken.xyz/v1/")
    // ... rest
);
```
Usage Tips
- Most SDKs also read the OPENAI_BASE_URL / ANTHROPIC_BASE_URL environment variables; an explicit base_url passed in code takes precedence over the env var, so the env vars act as defaults.
- To route requests to a group, add a request header: extraHeaders: { "X-Group": "vip" }.
- For streaming reads (including long thinking/reasoning responses), relax the read timeout to 600 s.
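The precedence rule in the first tip can be illustrated with a small sketch. resolve_base_url is a hypothetical helper that mirrors how these SDKs pick their endpoint, not a real SDK function:

```python
import os

# Hypothetical helper mirroring SDK behavior: an explicit base_url argument
# wins; the OPENAI_BASE_URL env var is only the fallback default.
def resolve_base_url(explicit=None):
    return explicit or os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1")

os.environ["OPENAI_BASE_URL"] = "https://tttoken.xyz/v1"
print(resolve_base_url())                          # env var used as the default
print(resolve_base_url("https://example.com/v1"))  # explicit argument wins
```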