Tags
Tags mark specific points in the repository's history as important.
This project is mirrored from https://github.com/lucidrains/x-transformers. Pull mirroring updated Sep 19, 2024.
0.14.3 · d4aa4507 · 0.14.3 · Jul 19, 2021
0.14.2 · 746aebe3 · clone pre and post softmax attention that is saved to the intermediate,... · Jul 18, 2021
0.14.1 · 9593cd55 · make emb_dropout optional · Jul 18, 2021
0.14.0 · f44f2318 · 0.14.0 · Jul 16, 2021
0.12.4 · f6f9e7fd · 0.12.4 · Jul 15, 2021
0.12.3 · c19aeb89 · 0.12.3 · Jul 08, 2021
0.12.2 · f0cee389 · 0.12.2 · Jun 29, 2021
0.12.1 · a11b1785 · 0.12.1 · May 26, 2021
0.12.0 · 89733390 · release collaborative heads · May 18, 2021
0.11.11 · 9f14fa30 · apply rotary embeddings to values as well · May 13, 2021
0.11.10 · 7b6ecb42 · fix calculation of max rotary embedding length · May 11, 2021
0.11.9 · e5c4a668 · Merge pull request #41 from lucidrains/pw/scale-t5 · May 10, 2021
0.11.8 · 857b2763 · make sure rotary positional embedding works with transformer-xl memories · May 10, 2021
0.11.7 · 2231eac4 · release patch · May 10, 2021
0.11.6 · 4e9984fa · 0.11.6 · May 08, 2021
0.11.5 · abf602ea · 0.11.5 · Apr 21, 2021
0.11.4 · 4b395aba · do partial rotary dimensions, clamped at dimension of 32 at minimum · Apr 18, 2021
0.11.2 · 87233591 · simpler faster rotary embeddings · Apr 18, 2021
0.11.1 · a5004251 · revert accidentally checked-in code · Apr 17, 2021
0.11.0 · f638ced8 · add continuous transformer wrapper for @guillefix · Apr 17, 2021