diff --git a/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb b/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb
index 43f8b639d3c319ddee81ea1ffc64282ba22acc4d..4b6052004eef8bc3d6b8cdf2008633b7b39d618d 100644
--- a/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb
+++ b/recipes/llama_api_providers/Groq/groq-api-cookbook/function-calling-101-ecommerce/Function-Calling-101-Ecommerce.ipynb
@@ -328,7 +328,7 @@
    "source": [
     "The two key parameters we need to include in our chat completion are `tools=tools` and `tool_choice=\"auto\"`, which provides the model with the available tools we've just defined and tells it to use one if appropriate (`tool_choice=\"auto\"` gives the LLM the option of using any, all or none of the available functions. To mandate a specific function call, we could use `tool_choice={\"type\": \"function\", \"function\": {\"name\":\"create_order\"}}`). \n",
     "\n",
-    "When the LLM decides to use a tool, the response is *not* a conversational chat, but . From there, we can execute the LLM-identified tool with the LLM-identified parameters, and feed the response *back* to the LLM for a second request so that it can respond with appropriate context from the tool it just used:"
+    "When the LLM decides to use a tool, the response is *not* a conversational chat, but a JSON object containing the tool choice and tool parameters. From there, we can execute the LLM-identified tool with the LLM-identified parameters, and feed the response *back* to the LLM for a second request so that it can respond with appropriate context from the tool it just used:"
    ]
   },
   {
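The round trip described in the notebook text above — the model returns a tool choice plus JSON parameters, we execute the tool, then feed the result back as a `"tool"` message for a second request — can be sketched locally. This is a minimal illustration, not the notebook's actual code: the `create_order` body, the `tool_call` payload values, and the helper names here are hypothetical; only the message shape mirrors the OpenAI-compatible format the Groq SDK returns.

```python
import json

# Hypothetical local stand-in for the notebook's create_order tool.
def create_order(product_id, quantity):
    return {"order_id": "A-1001", "product_id": product_id, "quantity": quantity}

available_tools = {"create_order": create_order}

# Shape of one tool call from the model's first response (illustrative values).
# The SDK returns function arguments as a JSON *string*, not a dict.
tool_call = {
    "id": "call_abc123",
    "function": {
        "name": "create_order",
        "arguments": '{"product_id": "sku-42", "quantity": 2}',
    },
}

# Execute the LLM-identified tool with the LLM-identified parameters...
fn = available_tools[tool_call["function"]["name"]]
result = fn(**json.loads(tool_call["function"]["arguments"]))

# ...then append the result as a "tool" message, which goes into the
# messages list of the second chat-completion request.
followup_message = {
    "role": "tool",
    "tool_call_id": tool_call["id"],
    "content": json.dumps(result),
}
```

In the real notebook flow, `followup_message` is appended to the running `messages` list alongside the assistant's tool-call message before the second `client.chat.completions.create(...)` call.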
diff --git a/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb b/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb
index ac6e13b2390961cb027ff8ff3a3b84c0da06f7d3..30d2de4c34efc303076744c1d1aff18cbcc08a59 100644
--- a/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb
+++ b/recipes/llama_api_providers/Groq/groq-api-cookbook/rag-langchain-presidential-speeches/rag-langchain-presidential-speeches.ipynb
@@ -57,11 +57,12 @@
    ]
   },
   {
+   "attachments": {},
    "cell_type": "markdown",
    "id": "4c18688b-178f-439d-90a4-590f99ade11f",
    "metadata": {},
    "source": [
-    "A Groq API Key is required for this demo - you can generate one for free [here](https://console.groq.com/keys). We will be using Pinecone as our vector database, which also requires an API key (you can create one index for a small project there for free on their Starter plan), but will also show how it works with [Chroma DB](https://www.trychroma.com/), a free open source alternative that stores vector embeddings in memory. We will also use the Llama3 8b model for this demo."
+    "A Groq API Key is required for this demo - you can generate one for free [here](https://console.groq.com/). We will be using Pinecone as our vector database, which also requires an API key (you can create one index for a small project there for free on their Starter plan), but will also show how it works with [Chroma DB](https://www.trychroma.com/), a free open source alternative that stores vector embeddings in memory. We will also use the Llama3 8b model for this demo."
    ]
   },
   {
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md
index aa2b40f9f3bc6506ea269a4d90b16d75c8375d82..f59ecf878969bbcb0e8a11eac55650ddca3b6da5 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/conversational-chatbot-langchain/README.md
@@ -12,6 +12,10 @@ A simple application that allows users to interact with a conversational chatbot
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Chatbot-with-Conversational-Memory-on-LangChain) or run it on the command line with `python main.py`
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md
index cd41e2eee2bb1fc4161e1974a5c7c10d3a28b7d3..ad8c4848401243b06f32ed9143084083fcb029f0 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/crewai-agents/README.md
@@ -12,9 +12,12 @@ The [CrewAI](https://docs.crewai.com/) Machine Learning Assistant is a command l
 
 - **LangChain Integration**: Incorporates LangChain to facilitate natural language processing and enhance the interaction between the user and the machine learning assistant.
 
-
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
-You can [fork and run this application on Replit](https://replit.com/@GroqCloud/CrewAI-Machine-Learning-Assistant) or run it on the command line with `python main.py`. You can upload a sample .csv to the same directory as ```main.py``` to give the application a head start on your ML problem. The application will output a Markdown file including python code for your ML use case to the same directory as main.py.
+<!-- markdown-link-check-enable -->
+
+You can [fork and run this application on Replit](https://replit.com/@GroqCloud/CrewAI-Machine-Learning-Assistant) or run it on the command line with `python main.py`. You can upload a sample .csv file to the same directory as `main.py` to give the application a head start on your ML problem. The application will output a Markdown file, including Python code for your ML use case, to the same directory as `main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md
index d40feb52f00f9ea94a62e28c14837672c7c4d33c..da196d47cec9d650884a343677362ac4964f5ade 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/groq-quickstart-conversational-chatbot/README.md
@@ -12,6 +12,10 @@ A simple application that allows users to interact with a conversational chatbot
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Groq-Quickstart-Conversational-Chatbot) or run it on the command line with `python main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md
index c2f9879b3083cf246d6c95de288b6491a4e5b21e..750752cd6bf009fbbcb79b90f3ddd795c8d742aa 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/groqing-the-stock-market-function-calling-llama3/README.md
@@ -18,6 +18,10 @@ The function calling in this application is handled by the Groq API, abstracted
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Groqing-the-Stock-Market-Function-Calling-with-Llama3) or run it on the command line with `python main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md
index dd7b906758d6853038e89004f6c2c18578eec7aa..d2b912502d3165e58a4e74f347c3fdc6ee1bc8c2 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/llamachat-conversational-chatbot-with-llamaIndex/README.md
@@ -14,4 +14,8 @@ A simple application that allows users to interact with a conversational chatbot
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
+
+<!-- markdown-link-check-enable -->
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md
index b7b756cf9f8b551b412a9232b00f5f9de988c339..9e8aa7d6e499d2eb2fe873f6a8d53414683e0e40 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/presidential-speeches-rag-with-pinecone/README.md
@@ -22,8 +22,12 @@ The main script of the application is [main.py](./main.py). Here's a brief overv
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example outside of this Repl. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You would also need your own [Pinecone](https://www.pinecone.io/) index with presidential speech embeddings to run this code locally. You can create a Pinecone API key and one index for a small project for free on their Starter plan, and visit [this Cookbook post](https://github.com/groq/groq-api-cookbook/blob/dan/replit-conversion/presidential-speeches-rag/presidential-speeches-rag.ipynb) for more info on RAG and a guide to uploading these embeddings to a vector database.
 
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Presidential-Speeches-RAG-with-Pinecone) or run it on the command line with `python main.py`.
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md
index 3d97eb57e4182af294387cef5c9b461e257d4cab..7e0451cd1b9d0b3166a104ecb51eae388983ad54 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/text-to-sql-json-mode/README.md
@@ -38,8 +38,12 @@ A well-crafted system prompt is essential for building a functional Text-to-SQL
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Building-a-Text-to-SQL-app-with-Groqs-JSON-mode) or run it on the command line with `python main.py`.
 
 ## Customizing with Your Own Data
diff --git a/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md b/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md
index 61537fa1ce3ecd659ec91e356c054b0da51042ce..94f151f41df5d076d0a485b077ad22d1640a17d0 100644
--- a/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md
+++ b/recipes/llama_api_providers/Groq/groq-example-templates/verified-sql-function-calling/README.md
@@ -36,8 +36,12 @@ The verified SQL queries and their descriptions are stored in YAML files located
 
 ## Usage
 
+<!-- markdown-link-check-disable -->
+
 You will need to store a valid Groq API Key as a secret to proceed with this example. You can generate one for free [here](https://console.groq.com/keys).
 
+<!-- markdown-link-check-enable -->
+
 You can [fork and run this application on Replit](https://replit.com/@GroqCloud/Execute-Verified-SQL-Queries-with-Function-Calling) or run it on the command line with `python main.py`.
 
 ## Customizing with Your Own Data