As a one-stop, scenario-based intelligent audio and video APaaS platform, Qiniu Cloud understands users' need for flexibility and convenience, and offers two ways to use the service:
Simply call Qiniu Cloud's public or private API endpoints (compatible with the openai library) to start using the DeepSeek-R1/V3 AI capabilities right away. There is no extra software to download and no complex configuration; you can integrate it into your application quickly.
Call the chat endpoint with the LLM API KEY obtained in the previous step. Supported models: "deepseek-r1" and "deepseek-v3". For example:
# Call the chat completions API (streaming)
export LLM_API_KEY="<your LLM API KEY>"
curl https://api.qnaigc.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -d '{
    "messages": [{"role": "user", "content": "七牛云提供 GPU 云产品能用于哪些场景?"}],
    "model": "deepseek-v3",
    "stream": true
  }'
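The same request can also be sent with the OpenAI Python SDK. The following example streams the response, then appends the assistant reply to the message history and runs a second round of the conversation: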
from openai import OpenAI

url = 'https://api.qnaigc.com/v1/'
llm_api_key = 'your llm_api_key'

client = OpenAI(
    base_url=url,
    api_key=llm_api_key
)

# Send a request with streaming output
content = ""
messages = [
    {"role": "user", "content": "七牛云提供 GPU 云产品能用于哪些场景?"}
]
response = client.chat.completions.create(
    model="deepseek-v3",
    messages=messages,
    stream=True,  # enable streaming output
    max_tokens=4096
)

# Receive the response chunk by chunk and accumulate the content
for chunk in response:
    if chunk.choices[0].delta.content:
        content += chunk.choices[0].delta.content
print(content)

# Round 2: append the assistant reply, then continue the conversation
messages.append({"role": "assistant", "content": content})
messages.append({'role': 'user', 'content': "继续"})
content = ""  # reset the accumulator so Round 2 output is not mixed with Round 1
response = client.chat.completions.create(
    model="deepseek-v3",
    messages=messages,
    stream=True
)
for chunk in response:
    if chunk.choices[0].delta.content:
        content += chunk.choices[0].delta.content
print(content)
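If you prefer to receive the complete reply in a single response, set stream=False and read the full message from response.choices[0].message.content: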
from openai import OpenAI

url = 'https://api.qnaigc.com/v1/'
llm_api_key = 'your llm_api_key'

client = OpenAI(
    base_url=url,
    api_key=llm_api_key
)

# Send a request with non-streaming output
messages = [
    {"role": "user", "content": "七牛云提供 GPU 云产品能用于哪些场景?"}
]
response = client.chat.completions.create(
    model="deepseek-v3",
    messages=messages,
    stream=False,
    max_tokens=4096
)
content = response.choices[0].message.content
print(content)

# Round 2: append the assistant reply, then continue the conversation
messages.append({"role": "assistant", "content": content})
messages.append({'role': 'user', 'content': "继续"})
response = client.chat.completions.create(
    model="deepseek-v3",
    messages=messages,
    stream=False
)
content = response.choices[0].message.content
print(content)
The API also supports web search. To remain compatible with the openai SDK, web search is enabled by appending "?search" to the model name. For example:
# Call the chat completions API with web search enabled
export LLM_API_KEY="<your LLM API KEY>"
curl https://api.qnaigc.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -d '{
    "messages": [{"role": "user", "content": "七牛云提供 GPU 云产品能用于哪些场景?"}],
    "model": "deepseek-v3?search"
  }'
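Web search works the same way through the OpenAI Python SDK. The snippet below is a minimal sketch that reuses the client setup from the earlier examples and only changes the model name to "deepseek-v3?search":

from openai import OpenAI

client = OpenAI(
    base_url='https://api.qnaigc.com/v1/',
    api_key='your llm_api_key'
)

# Enable web search by appending "?search" to the model name
response = client.chat.completions.create(
    model="deepseek-v3?search",
    messages=[
        {"role": "user", "content": "七牛云提供 GPU 云产品能用于哪些场景?"}
    ],
    stream=False
)
print(response.choices[0].message.content)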