This project is mirrored from https://github.com/meta-llama/llama-recipes.
Pull mirroring failed. Repository mirroring has been paused due to too many failed attempts; it can be resumed by a project maintainer or owner. Last successful update: unknown.
- Dec 05, 2024
  - Miguel authored
- Dec 03, 2024
  - Miguel authored
- Nov 30, 2024
  - Miguel authored
- Nov 25, 2024
  - Miguel authored
- Nov 21, 2024
- Nov 20, 2024
  - Kai Wu authored
  - Jayson Francis authored
  - Kai Wu authored
  - Kai Wu authored: Consolidated all CLI/UI/checkpointing functionality into a single file, added a fix (with code) for issue 702, and added usage instructions to local_inference/README.md (#757)
  - Kai Wu authored
  - Jeff Tang authored
  - Jeff Tang authored
- Nov 19, 2024
  - Guanghui Qin authored: Fix a typo: the FSDP wrapper should wrap `MllamaCrossAttentionDecoderLayer`, which was missing.
  - JimChienTW authored
  - Himanshu Shukla authored
  - Himanshu Shukla authored
  - Himanshu Shukla authored
  - Himanshu Shukla authored
  - JimChienTW authored
  - Himanshu Shukla authored
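The FSDP fix in the Nov 19 commit amounts to a layer class missing from the auto-wrap policy: FSDP's wrap policy matches transformer-layer classes by type, so any class left out of the set is silently not wrapped. A minimal, self-contained sketch of that failure mode (the empty classes here are stand-ins, not the real `transformers` implementations; the actual recipe would use `transformer_auto_wrap_policy` from `torch.distributed.fsdp.wrap`):

```python
# Stand-in layer classes; the real ones live in transformers' Mllama model code.
class MllamaSelfAttentionDecoderLayer:
    pass

class MllamaCrossAttentionDecoderLayer:
    pass

def make_wrap_policy(layer_classes):
    """Return a predicate deciding whether FSDP should wrap a given module."""
    classes = tuple(layer_classes)
    def should_wrap(module):
        return isinstance(module, classes)
    return should_wrap

# Before the fix: the cross-attention decoder layer was not in the set,
# so those layers were never sharded by FSDP.
buggy = make_wrap_policy({MllamaSelfAttentionDecoderLayer})
# After the fix: both decoder-layer types are matched and wrapped.
fixed = make_wrap_policy({MllamaSelfAttentionDecoderLayer,
                          MllamaCrossAttentionDecoderLayer})

layer = MllamaCrossAttentionDecoderLayer()
print(buggy(layer))  # False
print(fixed(layer))  # True
```

The fix is just adding the missing class to the wrap policy's set; the predicate itself is unchanged.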
-
- Nov 16, 2024
  - Sanyam Bhutani authored
  - JimChienTW authored
  - JimChienTW authored
- Nov 15, 2024
  - Himanshu Shukla authored
  - Jeff Tang authored
- Nov 13, 2024
  - Suraj Subramanian authored
  - Suraj Subramanian authored
- Nov 02, 2024
  - Himanshu Shukla authored
  - Himanshu Shukla authored
  - Himanshu Shukla authored
  - Himanshu Shukla authored
  - Himanshu Shukla authored: Added complete inference functionality for terminal, Gradio, and checkpoint inference in the UI/CLI
  - Himanshu Shukla authored: Added a single working file covering terminal, Gradio, and checkpoint inference
  - Himanshu Shukla authored
- Nov 01, 2024