Unverified commit 57f4f46a, authored by Sean Hatfield, committed by GitHub

Bump perplexity models (#3014)


* bump perplexity models

---------

Co-authored-by: Timothy Carambat <rambat1010@gmail.com>
parent 9584a7e1
models.js — the deprecated `llama-3.1-sonar-*` entries are replaced by the new `sonar-pro` and `sonar` models:

```diff
 const MODELS = {
-  "llama-3.1-sonar-small-128k-online": {
-    id: "llama-3.1-sonar-small-128k-online",
-    name: "llama-3.1-sonar-small-128k-online",
-    maxLength: 127072,
-  },
-  "llama-3.1-sonar-large-128k-online": {
-    id: "llama-3.1-sonar-large-128k-online",
-    name: "llama-3.1-sonar-large-128k-online",
-    maxLength: 127072,
-  },
-  "llama-3.1-sonar-huge-128k-online": {
-    id: "llama-3.1-sonar-huge-128k-online",
-    name: "llama-3.1-sonar-huge-128k-online",
-    maxLength: 127072,
-  },
-  "llama-3.1-sonar-small-128k-chat": {
-    id: "llama-3.1-sonar-small-128k-chat",
-    name: "llama-3.1-sonar-small-128k-chat",
-    maxLength: 131072,
-  },
-  "llama-3.1-sonar-large-128k-chat": {
-    id: "llama-3.1-sonar-large-128k-chat",
-    name: "llama-3.1-sonar-large-128k-chat",
-    maxLength: 131072,
-  },
-  "llama-3.1-8b-instruct": {
-    id: "llama-3.1-8b-instruct",
-    name: "llama-3.1-8b-instruct",
-    maxLength: 131072,
-  },
-  "llama-3.1-70b-instruct": {
-    id: "llama-3.1-70b-instruct",
-    name: "llama-3.1-70b-instruct",
-    maxLength: 131072,
-  },
+  "sonar-pro": {
+    "id": "sonar-pro",
+    "name": "sonar-pro",
+    "maxLength": 200000
+  },
+  "sonar": {
+    "id": "sonar",
+    "name": "sonar",
+    "maxLength": 127072
+  }
 };
 module.exports.MODELS = MODELS;
```
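Not part of the commit: a minimal sketch of how the exported `MODELS` map might be consumed, assuming it is required from a sibling module. The `contextWindowFor` helper and its default fallback are hypothetical.

```js
// Hypothetical usage sketch: look up a Perplexity model's context window in
// the exported MODELS map, falling back to a conservative default when a
// deprecated or unknown model id is still configured.
const { MODELS } = require("./models");

function contextWindowFor(modelId, fallback = 4096) {
  const model = MODELS[modelId]; // keys are Perplexity model ids, e.g. "sonar", "sonar-pro"
  return model ? model.maxLength : fallback;
}

console.log(contextWindowFor("sonar-pro")); // 200000
console.log(contextWindowFor("llama-3.1-sonar-small-128k-online")); // 4096 — id removed by this commit
```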
The accompanying model table is reduced to the two current models:

```diff
 | Model | Parameter Count | Context Length | Model Type |
 | :---------------------------------- | :-------------- | :------------- | :-------------- |
-| `llama-3.1-sonar-small-128k-online` | 8B | 127,072 | Chat Completion |
-| `llama-3.1-sonar-large-128k-online` | 70B | 127,072 | Chat Completion |
-| `llama-3.1-sonar-huge-128k-online` | 405B | 127,072 | Chat Completion |
-| `llama-3.1-sonar-small-128k-chat` | 8B | 131,072 | Chat Completion |
-| `llama-3.1-sonar-large-128k-chat` | 70B | 131,072 | Chat Completion |
-| `llama-3.1-8b-instruct` | 8B | 131,072 | Chat Completion |
-| `llama-3.1-70b-instruct` | 70B | 131,072 | Chat Completion |
+| `sonar-pro` | 8B | 200,000 | Chat Completion |
+| `sonar` | 8B | 127,072 | Chat Completion |
```
The model-collection script (which copies its output into ../models.js) gets updated comments describing the new manual workflow:

```diff
@@ -8,7 +8,12 @@
 // copy outputs into the export in ../models.js
 // Update the date below if you run this again because Perplexity added new models.
-// Last Collected: Sept 12, 2024
+// Last Collected: Jan 23, 2025
+// UPDATE: Jan 23, 2025
+// The table is no longer available on the website, but Perplexity has deprecated the
+// old models so now we can just update the chat_models.txt file with the new models
+// manually and then run this script to get the new models.
 import fs from "fs";
 ...
```
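The parsing code itself is not shown in this hunk. The following is a hypothetical sketch of the workflow the comments describe, assuming chat_models.txt holds the markdown table above and that the output shape matches models.js; the function name, file path, and column handling are assumptions.

```js
// Hypothetical sketch: read the manually maintained chat_models.txt markdown
// table and build an object shaped like the MODELS export in ../models.js.
import fs from "fs";

function parseChatModels(path = "chat_models.txt") {
  const rows = fs
    .readFileSync(path, "utf8")
    .split("\n")
    .filter((line) => line.startsWith("|") && !line.includes(":-"))
    .slice(1); // drop the header row; the ":-" separator row was filtered out

  const models = {};
  for (const row of rows) {
    // "| `sonar-pro` | 8B | 200,000 | Chat Completion |"
    const cols = row.split("|").map((c) => c.trim().replace(/`/g, ""));
    const [, id, , contextLength] = cols;
    models[id] = {
      id,
      name: id,
      maxLength: Number(contextLength.replace(/,/g, "")),
    };
  }
  return models;
}

console.log(JSON.stringify(parseChatModels(), null, 2));
```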