Tags
Tags give the ability to mark specific points in history as being important
This project is mirrored from https://github.com/lucidrains/x-transformers. Pull mirroring updated Sep 19, 2024.
1.17.1 · 7395ebd9 · 1.17.1 · Aug 04, 2023
1.16.24 · 1cfead36 · add the "simple" rmsnorm from the new transnormer paper · Jul 30, 2023
1.17.0 · 1cfead36 · add the "simple" rmsnorm from the new transnormer paper · Jul 30, 2023
1.16.23 · 792aa6dc · patch · Jul 26, 2023
1.16.22 · db58c041 · add same functionality as add_zero_attn in pytorch mha, with attn_add_zero_kv = True · Jul 25, 2023
1.16.21 · 3451614a · quick fix · Jul 17, 2023
1.16.20 · c6b6d2e3 · address https://github.com/lucidrains/x-transformers/issues/166 · Jul 17, 2023
1.16.19 · 83189efe · address https://github.com/lucidrains/x-transformers/issues/165 · Jul 16, 2023
1.16.18 · 503fd484 · ensure mask being passed into flash attention is expanded across heads · Jul 13, 2023
1.16.17 · cbd834fe · address https://github.com/lucidrains/x-transformers/pull/161, which works by... · Jul 08, 2023
1.16.16 · 43277515 · bring back rotary positional interpolation · Jun 30, 2023
1.16.15 · d2e0700a · just be consistent with masking convention · Jun 30, 2023
1.16.14 · f58100f4 · 1.16.14 · Jun 30, 2023
1.16.12 · 2a5bbdd8 · address masking logic for flash attention · Jun 30, 2023
1.16.11 · 4fd11a53 · remove rotary interpolation for now 1.16.11 · Jun 30, 2023
1.16.10 · 8fa7b4ca · help out @cutoken at https://github.com/lucidrains/x-transformers/issues/159 · Jun 29, 2023
1.16.9 · 114fab9e · quick fix, thanks to @prestonyun · Jun 14, 2023
1.16.8 · 43c508ae · 1.16.8 · Jun 03, 2023
1.16.7 · f71f3279 · address https://github.com/lucidrains/x-transformers/issues/152 · May 31, 2023
1.16.6 · 84fcaf6d · fix cascading heads in the presence of alibi pos bias · May 27, 2023