feat(community): Perplexity integration #7817
base: main
Conversation
anadi45 commented Mar 8, 2025
- Adds support for the Perplexity client.
  return "ChatPerplexity";
}

modelName = "llama-3.1-sonar-small-128k-online";
Let's not take a default here. Also, we are standardizing on model over modelName.
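For illustration, a minimal sketch of what that might look like (the interface name and fields here are assumptions, not the PR's actual code): a required model field with no client-side default, so the caller must pick a model explicitly.

import { BaseChatModelParams } from "@langchain/core/language_models/chat_models";

// Hypothetical input interface for the integration (sketch only)
export interface PerplexityChatInput extends BaseChatModelParams {
  // Standardized name: model rather than modelName; no default is
  // applied, so construction fails loudly if the caller omits it
  model: string;
  apiKey?: string;
}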
Thanks so much for this! We will also want to use our standard template for docs. I am happy to take over and land this this week if you don't have time.
modelName = "llama-3.1-sonar-small-128k-online";

temperature = 0.2;
Prefer not setting defaults
streaming = false;

topP = 0.9;
Prefer not setting defaults
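A minimal sketch of the suggested change (field names taken from the diff above): declare the tunables as optional fields with no initializers, so undefined values can simply be omitted from the request and Perplexity's own server-side defaults apply.

// Sketch only: no client-side defaults
temperature?: number;

topP?: number;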
@@ -432,6 +434,7 @@ export const config = {
  "chat_models/bedrock",
  "chat_models/bedrock/web",
  "chat_models/llama_cpp",
+ "chat_models/perplexity",
You don't need this entry since the integration doesn't use any optional deps.
yield generationChunk;

// Emit the chunk to the callback manager if provided
if (_runManager) {
nit: don't prefix with _ since this is used in the function
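A sketch of the suggested rename, assuming the standard _streamResponseChunks signature from @langchain/core (surrounding streaming logic elided, not the PR's actual code):

import { AIMessageChunk, BaseMessage } from "@langchain/core/messages";
import { ChatGenerationChunk } from "@langchain/core/outputs";
import { CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";

// Inside the ChatPerplexity class: the parameter keeps its plain name
// because the method actually uses it (no underscore "unused" marker)
async *_streamResponseChunks(
  messages: BaseMessage[],
  options: this["ParsedCallOptions"],
  runManager?: CallbackManagerForLLMRun
): AsyncGenerator<ChatGenerationChunk> {
  // ...build generationChunk from the Perplexity stream delta...
  const generationChunk = new ChatGenerationChunk({
    text: "...",
    message: new AIMessageChunk({ content: "..." }),
  });
  yield generationChunk;
  // Emit the token to the callback manager if provided
  await runManager?.handleLLMNewToken(generationChunk.text);
}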
const response = await this.client.chat.completions.create({
  messages: messagesList,
  ...this.invocationParams(),
  stream: false,
We do also support calling .invoke/.generate with stream: true and aggregating. This is niche but convenient if you want to use the final output while also streaming back tokens.
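A minimal sketch of that pattern, assuming a streaming flag on the class and the standard helpers from @langchain/core (a sketch of the convention, not the PR's actual code):

import { BaseMessage } from "@langchain/core/messages";
import { ChatGenerationChunk, ChatResult } from "@langchain/core/outputs";
import { CallbackManagerForLLMRun } from "@langchain/core/callbacks/manager";

// Inside the ChatPerplexity class:
async _generate(
  messages: BaseMessage[],
  options: this["ParsedCallOptions"],
  runManager?: CallbackManagerForLLMRun
): Promise<ChatResult> {
  if (this.streaming) {
    // Stream tokens (so callbacks fire as they arrive) while
    // concatenating the chunks into one final generation
    let finalChunk: ChatGenerationChunk | undefined;
    for await (const chunk of this._streamResponseChunks(
      messages,
      options,
      runManager
    )) {
      finalChunk = finalChunk === undefined ? chunk : finalChunk.concat(chunk);
    }
    return { generations: finalChunk ? [finalChunk] : [], llmOutput: {} };
  }
  // ...otherwise fall through to the non-streaming call shown in the diff above
}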