From ace827dd5298846dbf9541f3e1708cbf33a697a6 Mon Sep 17 00:00:00 2001
From: Sanyam Bhutani <sanyambhutani@meta.com>
Date: Fri, 18 Oct 2024 13:41:00 -0700
Subject: [PATCH] fix link 1

---
 recipes/experimental/long_context/H2O/README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/recipes/experimental/long_context/H2O/README.md b/recipes/experimental/long_context/H2O/README.md
index 675e1ef6..b73d8706 100644
--- a/recipes/experimental/long_context/H2O/README.md
+++ b/recipes/experimental/long_context/H2O/README.md
@@ -36,7 +36,7 @@ Expected results on XSUM (Rouge-2 score, the higher the better) from the above s
 
 ### One Demo on Streaming to "Infinite" Context Length
 
-The following example demonstrates the generation process of "infinite" sequence length. We use MT-Bench data and generate the context sample-by-sample. The KV Cache will keep the KV pairs from the previous samples while maintain a fixed size. Results can be found on [Demo](https://allenz.work/?p=11) (Video 1).
+The following example demonstrates generation at "infinite" sequence length. We use MT-Bench data and generate the context sample-by-sample. The KV cache keeps the KV pairs from the previous samples while maintaining a fixed size.
 
 ```
 # run with full cache
-- 
GitLab
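A minimal sketch of the "fixed size while streaming" idea described in the README hunk above, assuming a simple sliding-window eviction policy for illustration (the actual H2O method evicts entries by accumulated attention scores, which is not modeled here; the function name and parameters are hypothetical):

```python
from collections import deque

def stream_kv(pairs, max_entries=4):
    """Accumulate (key, value) pairs from successive samples into a
    bounded cache. A deque with maxlen silently drops the oldest
    entry when full, so the cache keeps pairs from earlier samples
    while maintaining a fixed size (sliding-window stand-in for
    H2O's heavy-hitter eviction)."""
    cache = deque(maxlen=max_entries)
    for kv in pairs:
        cache.append(kv)  # oldest pair is evicted automatically at capacity
    return cache

# Stream ten toy KV pairs through a cache capped at four entries.
cache = stream_kv([(f"k{t}", f"v{t}") for t in range(10)])
print(len(cache))    # size stays fixed at 4
print(cache[0][0])   # oldest surviving key: "k6"
```

With `maxlen` the eviction is handled by the data structure itself, which keeps the sketch short; a real implementation would instead choose which tensor slices to drop per attention head.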