From c6a714b4d85c8379070c773f77ce2e16c89b1b53 Mon Sep 17 00:00:00 2001
From: dloman118 <99347459+dloman118@users.noreply.github.com>
Date: Thu, 13 Jun 2024 11:10:27 -0400
Subject: [PATCH] fix broken link errors

---
 .../Function-Calling-101-Ecommerce.ipynb                   | 2 +-
 .../rag-langchain-presidential-speeches.ipynb              | 3 ++-
 .../conversational-chatbot-langchain/README.md             | 4 ++++
 .../Groq/groq-example-templates/crewai-agents/README.md    | 7 +++++--
 .../groq-quickstart-conversational-chatbot/README.md       | 4 ++++
 .../README.md                                              | 4 ++++
 .../README.md                                              | 4 ++++
 .../presidential-speeches-rag-with-pinecone/README.md      | 4 ++++
 .../groq-example-templates/text-to-sql-json-mode/README.md | 4 ++++
 .../verified-sql-function-calling/README.md                | 4 ++++
 10 files changed, 36 insertions(+), 4 deletions(-)

diff --git a/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb b/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb
index 43f8b639..4b605200 100644
--- a/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb
+++ b/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb
@@ -328,7 +328,7 @@
    "source": [
     "The two key parameters we need to include in our chat completion are `tools=tools` and `tool_choice=\"auto\"`, which provides the model with the available tools we've just defined and tells it to use one if appropriate (`tool_choice=\"auto\"` gives the LLM the option of using any, all or none of the available functions. To mandate a specific function call, we could use `tool_choice={\"type\": \"function\", \"function\": {\"name\":\"create_order\"}}`). \n",
     "\n",
-    "When the LLM decides to use a tool, the response is *not* a conversational chat, but . From there, we can execute the LLM-identified tool with the LLM-identified parameters, and feed the response *back* to the LLM for a second request so that it can respond with appropriate context from the tool it just used:"
+    "When the LLM decides to use a tool, the response is *not* a conversational chat, but a JSON object containing the tool choice and tool parameters. From there, we can execute the LLM-identified tool with the LLM-identified parameters, and feed the response *back* to the LLM for a second request so that it can respond with appropriate context from the tool it just used:"
   ]
  },
  {
diff --git a/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb b/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb
index ac6e13b2..30d2de4c 100644
--- a/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb
+++ b/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb
@@ -57,11 +57,12 @@
    ]
   },
   {
+   "attachments": {},
    "cell_type": "markdown",
    "id": "4c18688b-178f-439d-90a4-590f99ade11f",
    "metadata": {},
    "source": [
-    "A Groq API Key is required for this demo - you can generate one for free [here](https://console.groq.com/keys). We will be using Pinecone as our vector database, which also requires an API key (you can create one index for a small project there for free on their Starter plan), but will also show how it works with [Chroma DB](https://www.trychroma.com/), a free open source alternative that stores vector embeddings in memory. We will also use the Llama3 8b model for this demo."
+    "A Groq API Key is required for this demo - you can generate one for free [here](https://console.groq.com/). We will be using Pinecone as our vector database, which also requires an API key (you can create one index for a small project there for free on their Starter plan), but will also show how it works with [Chroma DB](https://www.trychroma.com/), a free open source alternative that stores vector embeddings in memory. We will also use the Llama3 8b model for this demo."
   ]
  },
  {
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md
index aa2b40f9..f59ecf87 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md
@@ -12,6 +12,10 @@ A simple application that allows users to interact with a conversational chatbot

 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).

+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Chatbot-with-Conversational-Memory-on-LangChain) or run it on the command line with `python main.py`
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md
index cd41e2ee..ad8c4848 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md
@@ -12,9 +12,12 @@ The [CrewAI](https://docs.crewai.com/) Machine Learning Assistant is a command l

 - **LangChain Integration**: Incorporates LangChain to facilitate natural language processing and enhance the interaction between the user and the machine learning assistant.

-
 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).

-You can [fork and run this application on Replit](https://replit.com/@GroqCloud/CrewAI-Machine-Learning-Assistant) or run it on the command line with `python main.py`. You can upload a sample .csv to the same directory as ```main.py``` to give the application a head start on your ML problem. The application will output a Markdown file including python code for your ML use case to the same directory as main.py.
+<!-- markdown-link-check-enable -->
+
+You can [fork and run this application on Replit](https://replit.com/@GroqCloud/CrewAI-Machine-Learning-Assistant) or run it on the command line with `python main.py`. You can upload a sample .csv to the same directory as `main.py` to give the application a head start on your ML problem. The application will output a Markdown file including python code for your ML use case to the same directory as main.py.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md
index d40feb52..da196d47 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md
@@ -12,6 +12,10 @@ A simple application that allows users to interact with a conversational chatbot

 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).

+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Groq-Quickstart-Conversational-Chatbot) or run it on the command line with `python main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md
index c2f9879b..750752cd 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md
@@ -18,6 +18,10 @@ The function calling in this application is handled by the Groq API, abstracted

 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).

+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Groqing-the-Stock-Market-Function-Calling-with-Llama3) or run it on the command line with `python main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md
index dd7b9067..d2b91250 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md
@@ -14,4 +14,8 @@ A simple application that allows users to interact with a conversational chatbot

 ##Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
+
+<!-- markdown-link-check-enable -->
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md
index b7b756cf..9e8aa7d6 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md
@@ -22,8 +22,12 @@ The main script of the application is [main.py](./main.py). Here's a brief overv

 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example outside of this Repl. You can generate one for free [here](https://console.groq.com/keys).

+<!-- markdown-link-check-enable -->
+
 You would also need your own [Pinecone](https://www.pinecone.io/) index with presidential speech embeddings to run this code locally. You can create a Pinecone API key and one index for a small project for free on their Starter plan, and visit [this Cookbook post](https://github.com/groq/groq-api-cookbook/blob/dan/replit-conversion/presidential-speeches-rag/presidential-speeches-rag.ipynb) for more info on RAG and a guide to uploading these embeddings to a vector database

 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Presidential-Speeches-RAG-with-Pinecone) or run it on the command line with `python main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md
index 3d97eb57..7e0451cd 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md
@@ -38,8 +38,12 @@ A well-crafted system prompt is essential for building a functional Text-to-SQL

 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).

+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Building-a-Text-to-SQL-app-with-Groqs-JSON-mode) or run it on the command line with `python main.py`.

 ## Customizing with Your Own Data
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md
index 61537fa1..94f151f4 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md
@@ -36,8 +36,12 @@ The verified SQL queries and their descriptions are stored in YAML files located

 ## Usage

+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).

+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Execute-Verified-SQL-Queries-with-Function-Calling) or run it on the command line with `python main.py`.

 ## Customizing with Your Own Data
--
GitLab