Tags
Tags mark specific points in the project's history as important.
This project is mirrored from https://github.com/lucidrains/x-transformers. Pull mirroring updated Sep 19, 2024.
0.10.2 · f840796f · 0.10.2 · Apr 12, 2021
0.10.1 · 3517f3dc · allow residual attention to work with pre norm · Apr 09, 2021
0.10.0 · e5b03241 · add rotary positional embedding · Mar 25, 2021
0.9.1 · 9404f86f · add ability to return embeddings from vision transformer wrapper · Mar 24, 2021
0.9.0 · 68ce7e7c · add rms norm, given transformer modifications paper out of google · Mar 21, 2021
0.8.4 · c40815f8 · allow for setting embedding dimension to be different than model dimension · Mar 02, 2021
0.8.3 · 6b93c21b · add an assert for relative positional keyword arguments · Feb 07, 2021
0.8.2 · 17617af7 · add an assert for relative positional keyword arguments · Feb 07, 2021
0.8.1 · 64ee68b6 · fix residual gating · Jan 16, 2021
0.8.0 · 25dee855 · add gating at residuals, from deepminds paper for stabilizing txl for RL paper · Jan 16, 2021
0.7.4 · 08cc84e3 · fix bug with prenorm, introduced when adding residual attention · Jan 14, 2021
0.7.3 · b82afd71 · allow floats for ff_mult · Jan 11, 2021
0.7.2 · 208d70d3 · fix bug with default empty memories for txl · Jan 04, 2021
0.7.1 · fb7a5d47 · fix some more issues with T5 rel pos bias, thanks to @adrian-spataru · Jan 04, 2021
0.7.0 · 058bc89a · untie embedding, after learning T5 switched to untied classifier weights in... · Jan 04, 2021
0.6.7 · 399efcbf · bump bug fix release · Jan 03, 2021
0.6.6 · 6dda4874 · avoid potential issue with PAR and transformer-xl recurrence · Jan 02, 2021
0.6.5 · af23656c · add PAR, credit goes to @lunixbochs · Jan 02, 2021
0.6.4 · 9cf70013 · fix bugs introduced from attention intermediates and also with feedforward... · Jan 02, 2021
0.6.3 · 15b327ae · make shortformer possible · Jan 01, 2021