When building LLM applications, developers often face the challenge of choosing a suitable large language model (LLM) and integrating a diverse set of tools. With the pne.chat() function from the promptulate library, this has become simpler than ever. This article shows how to use pne.chat() to build powerful, flexible LLM applications.
pne.chat() is one of the core features of the promptulate library. It integrates the capabilities of many LLMs, covering more than 100 different models from providers including OpenAI, Anthropic, and Cohere. Developers can call different LLM APIs through a single unified interface, which greatly simplifies the development workflow.
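To see why a single entry point helps, here is a toy sketch (not promptulate's actual implementation) of how one chat() function can dispatch to different provider backends based on the model name; the backends here just echo their input:

```python
# Toy illustration of a unified chat interface (NOT promptulate's
# real implementation): one entry point routes calls to different
# provider backends based on the model name.

def _openai_backend(messages, model):
    # A real backend would call the OpenAI API here.
    return f"[openai/{model}] echo: {messages[-1]['content']}"

def _anthropic_backend(messages, model):
    # A real backend would call the Anthropic API here.
    return f"[anthropic/{model}] echo: {messages[-1]['content']}"

# Map model-name prefixes to backends.
_BACKENDS = {
    "gpt": _openai_backend,
    "claude": _anthropic_backend,
}

def chat(messages, model="gpt-3.5-turbo"):
    for prefix, backend in _BACKENDS.items():
        if model.startswith(prefix):
            return backend(messages, model)
    raise ValueError(f"unsupported model: {model}")

msgs = [{"role": "user", "content": "Who are you?"}]
print(chat(msgs, model="claude-2"))
```

The application code stays the same no matter which provider serves the request; only the model string changes.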
Having a conversation with pne.chat() is as simple as chatting with OpenAI's GPT models. Just prepare the conversation messages; pne.chat() handles the format conversion automatically and returns the response.
import promptulate as pne
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who are you?"},
]
response: str = pne.chat(messages=messages)
print(response)
I am a helpful assistant designed to assist you with any questions or tasks you may have. How can I assist you today?
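Because pne.chat() takes a plain list of role/content dicts, a multi-turn conversation is just a matter of appending each exchange to that list. A minimal sketch of keeping history (append_turn is our own helper, not part of promptulate):

```python
# Minimal conversation-history helper for the role/content message
# format shown above. append_turn is a hypothetical helper, not part
# of promptulate.

def append_turn(messages, role, content):
    """Append one turn to the conversation history and return it."""
    messages.append({"role": role, "content": content})
    return messages

history = [{"role": "system", "content": "You are a helpful assistant."}]
append_turn(history, "user", "Who are you?")
# In a real app you would now call: reply = pne.chat(messages=history)
append_turn(history, "assistant", "I am a helpful assistant.")
append_turn(history, "user", "What can you do?")

print(len(history))  # 4 entries including the system prompt
```

Passing the whole history back into pne.chat() on each turn is what gives the assistant memory of the conversation.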
If you need more advanced functionality, metadata becomes important. By setting return_raw_response=True, you can obtain the raw response, which contains metadata such as the model that served the request and the token usage. You can also select a specific model through the model parameter:
import promptulate as pne
response = pne.chat(
    messages="When is your knowledge up to?",
    model="gpt-4-1106-preview"
)
print(response)
My knowledge is up to date as of March 2021. Any events or developments occurring after that date would not be included in my responses. If you're asking for any recent information or updates, I recommend checking the latest sources as my information might not be current.
pne.chat() supports calling more than 100 LLMs; you only need to provide the corresponding API key. Here is an example using Anthropic's claude-2 model:
import os
import promptulate as pne
os.environ["ANTHROPIC_API_KEY"] = "your-api-key"
messages = [{"role": "user", "content": "Who are you?"}]
response = pne.chat(messages=messages, model="claude-2")
print(response)
When building complex agent projects, formatted output is a necessary measure for improving system robustness. pne.chat() can return formatted objects, which is essential for ensuring that outputs are consistent and usable.
from typing import List
import promptulate as pne
from pydantic import BaseModel, Field

class LLMResponse(BaseModel):
    provinces: List[str] = Field(description="All provinces in China")

response: LLMResponse = pne.chat(
    messages="Please tell me all the provinces in China.",
    output_schema=LLMResponse
)
print(response.provinces)
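The output_schema argument is an ordinary Pydantic model, so the validation it provides can be seen in isolation. A sketch, with a made-up payload standing in for the JSON parsed from an LLM reply:

```python
from typing import List
from pydantic import BaseModel, Field

class LLMResponse(BaseModel):
    provinces: List[str] = Field(description="All provinces in China")

# Made-up payload standing in for JSON parsed from an LLM reply.
raw_payload = {"provinces": ["Guangdong", "Zhejiang", "Sichuan"]}

# Validating against the schema yields a typed object instead of a
# loose dict; malformed payloads raise a ValidationError instead of
# silently propagating bad data.
parsed = LLMResponse(**raw_payload)
print(parsed.provinces)  # ['Guangdong', 'Zhejiang', 'Sichuan']
```

This is why schema-constrained output improves robustness: downstream code works with typed attributes rather than guessing at dict keys.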
For reference, a call made with return_raw_response=True yields both the assistant's plain content and the raw response with its metadata:
I am an AI assistant here to help you with any questions or tasks you may have. How can I assist you today?
{'id': 'chatcmpl-8UK0tfwlkixWyaxKJ2XWNGMVGFPo0', 'choices': [{'finish_reason': 'stop', 'index': 0, 'message': {'content': 'I am an AI assistant here to help you with any questions or tasks you may have. How can I assist you today?', 'role': 'assistant'}}], 'created': 1702237461, 'model': 'gpt-3.5-turbo-0613', 'object': 'chat.completion', 'system_fingerprint': None, 'usage': {'completion_tokens': 25, 'prompt_tokens': 20, 'total_tokens': 45}, '_response_ms': 2492.372}
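Once you have a raw response dict shaped like the sample above, the metadata fields can be read with plain dict access. A sketch using a hard-coded copy of that payload for illustration:

```python
# Reading metadata out of a raw response dict shaped like the sample
# above (hard-coded here for illustration).
raw = {
    "id": "chatcmpl-8UK0tfwlkixWyaxKJ2XWNGMVGFPo0",
    "model": "gpt-3.5-turbo-0613",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {"content": "I am an AI assistant.", "role": "assistant"},
        }
    ],
    "usage": {"completion_tokens": 25, "prompt_tokens": 20, "total_tokens": 45},
}

content = raw["choices"][0]["message"]["content"]
total_tokens = raw["usage"]["total_tokens"]
print(f"{raw['model']} used {total_tokens} tokens")
```

Token usage is the field you will most often want in practice, e.g. for cost tracking or rate limiting.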
pne.chat() also supports streaming responses, which means you can interact with your assistant in real time.
import promptulate as pne
response = pne.chat("Who are you?", stream=True)
for chunk in response:
    print(chunk)
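Chunks from a streaming response can be displayed as they arrive and accumulated into the full reply. A sketch in which a stand-in generator plays the role of the chunk iterator returned by a streaming call (we assume string chunks here):

```python
def fake_stream():
    # Stand-in for the chunk iterator a streaming call returns;
    # real chunks arrive incrementally from the model.
    for piece in ["I ", "am ", "an ", "assistant."]:
        yield piece

full_reply = []
for chunk in fake_stream():
    print(chunk, end="", flush=True)  # show each chunk as it arrives
    full_reply.append(chunk)
full_reply = "".join(full_reply)
```

Collecting chunks into a list and joining once at the end avoids repeated string concatenation while still letting the user see output immediately.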
As a one-stop solution for AI conversation development, pne.chat() gives developers unprecedented convenience and power. Whether you want to quickly integrate an LLM or need to build a complex conversation system, pne.chat() greatly lowers the barrier to development and improves efficiency. As promptulate continues to expand its tools and retrieval methods, pne.chat() is set to become a go-to choice for LLM application development.
Original-content statement: this article was published on the Tencent Cloud Developer Community with the author's authorization and may not be reproduced without permission.
For infringement concerns, please contact cloudcommunity@tencent.com for removal.