From 201daff2d1829e031cc2e0520688259fb0e91351 Mon Sep 17 00:00:00 2001
From: Suraj Subramanian <5676233+subramen@users.noreply.github.com>
Date: Wed, 27 Mar 2024 11:01:17 -0400
Subject: [PATCH] Add note on CUDA version + remove 'test' from pytorch whl url

---
 README.md | 15 +++++++++------
 1 file changed, 9 insertions(+), 6 deletions(-)

diff --git a/README.md b/README.md
index 45a48ff3..01558a90 100644
--- a/README.md
+++ b/README.md
@@ -38,24 +38,27 @@ Some features (especially fine-tuning with FSDP + PEFT) currently require PyTorc
 ### Installing
 Llama-recipes provides a pip distribution for easy install and usage in other projects. Alternatively, it can be installed from source.
 
+> [!NOTE]
+> Ensure you use the correct CUDA version (reported by `nvidia-smi`) when installing the PyTorch wheels. Here CUDA 11.8 corresponds to the `cu118` tag.
+
 #### Install with pip
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes
 ```
 
 #### Install with optional dependencies
 Llama-recipes offers the installation of optional packages. There are three optional dependency groups.
 To run the unit tests we can install the required dependencies with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[tests]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes[tests]
 ```
 For the vLLM example we need additional requirements that can be installed with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[vllm]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes[vllm]
 ```
 To use the sensitive topics safety checker install with:
 ```
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 llama-recipes[auditnlg]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 llama-recipes[auditnlg]
 ```
 Optional dependencies can also be combined with [option1,option2].
 
@@ -65,14 +68,14 @@ To install from source e.g. for development use these commands. We're using hatc
 git clone git@github.com:meta-llama/llama-recipes.git
 cd llama-recipes
 pip install -U pip setuptools
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 -e .
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 -e .
 ```
 For development and contributing to llama-recipes please install all optional dependencies:
 ```
 git clone git@github.com:meta-llama/llama-recipes.git
 cd llama-recipes
 pip install -U pip setuptools
-pip install --extra-index-url https://download.pytorch.org/whl/test/cu118 -e .[tests,auditnlg,vllm]
+pip install --extra-index-url https://download.pytorch.org/whl/cu118 -e .[tests,auditnlg,vllm]
 ```
 
 
-- 
GitLab
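
The note added in this patch can be followed mechanically: read the CUDA version out of `nvidia-smi` and turn it into the wheel tag. A minimal sketch (the `sed` pattern and variable names are illustrative, not part of the patch; adjust to your system):

```shell
#!/bin/sh
# Derive the PyTorch wheel CUDA tag (e.g. cu118) from the driver's
# reported CUDA version, then use it in the extra index URL.
cuda_version=$(nvidia-smi | sed -n 's/.*CUDA Version: \([0-9.]*\).*/\1/p' | head -n1)
tag="cu$(echo "$cuda_version" | tr -d '.')"
echo "Using wheel tag: $tag"
pip install --extra-index-url "https://download.pytorch.org/whl/${tag}" llama-recipes
```

For example, a driver reporting `CUDA Version: 11.8` yields `cu118`, matching the URLs used throughout the patched README.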