Tags
Tags give the ability to mark specific points in history as being important.
This project is mirrored from https://github.com/lucidrains/lumiere-pytorch. Pull mirroring updated Sep 19, 2024.
0.0.24 · 5308c890 · torch.compile friendly pytree · Jul 26, 2024
0.0.23 · 3dcd22be · remove groupnorm (https://arxiv.org/abs/2312.02696) · May 07, 2024
0.0.22 · 125fd4de · module names may not be given to Lumiere in the right order of execution... · May 07, 2024
0.0.21 · 1b26c9fc · address https://github.com/lucidrains/lumiere-pytorch/issues/4 · May 07, 2024
0.0.20 · b806d167 · fix the forced weight norms for magnitude preserving layers · Feb 19, 2024
0.0.19 · 25b40cdb · export the magnitude preserving temporal layers · Feb 14, 2024
0.0.18 · 718443c5 · Karras shows dropout is still useful · Feb 13, 2024
0.0.17 · 158c91b7 · complete magnitude preserving temporal unet layers for space-time karras unet · Feb 13, 2024
0.0.15 · 849682e7 · prepare for magnitude preserving temporal modules · Feb 13, 2024
0.0.14 · 16d69cdd · handle channel-last, since some unet middle blocks may already be in that format for the transformer · Feb 13, 2024
0.0.11 · 4471577e · just do some magic and get it working with karras unet for starters · Feb 12, 2024
0.0.10 · 68bf9894 · works, but only without temporal down and upsampling · Feb 11, 2024
0.0.9 · 6902d8ff · make lumiere wrapper work with x-unet and karras-unet for starters · Feb 10, 2024
0.0.8 · 6cb25d7b · make lumiere wrapper work with x-unet and karras-unet for starters · Feb 10, 2024
0.0.7 · 7ebfa61b · allow time dimension to be specified on init of the four time-related modules... · Feb 10, 2024
0.0.6 · 957b3c79 · quick fix · Feb 10, 2024
0.0.5 · f456ad27 · use a decorator to handle rearranging to temporal 1d sequence for temporal down and upsample (see the sketch after this list) · Feb 09, 2024
0.0.4 · a7e29675 · add temporal up and downsample with conv1d and convtranspose1d with bilinear kernel (see the sketch after this list) · Feb 09, 2024
0.0.3 · 5116d9e5 · identity init for inflation modules (see the sketch after this list) · Feb 09, 2024
0.0.2 · 60954937 · unclear what the real structure of the 1d Attention blocks is, so make it configurable · Feb 09, 2024
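
The 0.0.5 entry mentions a decorator that rearranges video activations into a temporal 1d sequence before the temporal down/upsample runs. Below is a minimal sketch of how such a decorator could look, assuming (batch, channels, time, height, width) activations and using einops; the names as_temporal_1d and temporal_downsample are hypothetical and not the repository's actual API.

import torch
import torch.nn.functional as F
from functools import wraps
from einops import rearrange

def as_temporal_1d(fn):
    # hypothetical decorator: fold the spatial dims into the batch so that fn only
    # ever sees a plain 1d sequence over time, then restore the video shape
    @wraps(fn)
    def inner(x, *args, **kwargs):
        b, c, t, h, w = x.shape                              # (batch, channels, time, height, width)
        x = rearrange(x, 'b c t h w -> (b h w) c t')
        x = fn(x, *args, **kwargs)                           # temporal op on (batch*height*width, channels, time)
        return rearrange(x, '(b h w) c t -> b c t h w', b=b, h=h, w=w)
    return inner

@as_temporal_1d
def temporal_downsample(x):
    return F.avg_pool1d(x, kernel_size=2)                    # halve the number of frames

video = torch.randn(1, 64, 16, 8, 8)
print(temporal_downsample(video).shape)                      # torch.Size([1, 64, 8, 8, 8])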
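
The 0.0.4 entry describes temporal resampling built from Conv1d and ConvTranspose1d with a bilinear kernel. The sketch below is only one plausible reading of that description, not the repository's actual code: the downsampler is a strided Conv1d, and the upsampler's weights start as the standard 1d linear-interpolation filter; linear_upsample_kernel is a hypothetical helper.

import torch
from torch import nn

def linear_upsample_kernel(factor=2):
    # standard 1d linear-interpolation ("bilinear") filter used to seed upsampling weights
    size = 2 * factor - factor % 2                           # 4 taps for factor 2
    center = factor - 1 if size % 2 else factor - 0.5
    return 1 - (torch.arange(size) - center).abs() / factor  # tensor([0.25, 0.75, 0.75, 0.25])

dim = 64
down = nn.Conv1d(dim, dim, kernel_size=3, stride=2, padding=1)                      # halves the frame count
up = nn.ConvTranspose1d(dim, dim, kernel_size=4, stride=2, padding=1, groups=dim)   # doubles it

with torch.no_grad():                                        # seed the upsampler with the interpolation filter
    up.weight.copy_(linear_upsample_kernel().view(1, 1, -1).repeat(dim, 1, 1))
    up.bias.zero_()

frames = torch.randn(1, dim, 16)                             # (batch, channels, time)
print(down(frames).shape)                                    # torch.Size([1, 64, 8])
print(up(down(frames)).shape)                                # torch.Size([1, 64, 16])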
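
The 0.0.3 entry refers to identity-initializing the inflation modules, i.e. the newly inserted temporal layers, so that the wrapped, pretrained image unet is reproduced exactly at the start of training. A minimal sketch of one common way to achieve this (zero init plus a residual connection); TemporalInflation is a hypothetical module, not the repository's actual class.

import torch
from torch import nn

class TemporalInflation(nn.Module):
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)
        nn.init.zeros_(self.conv.weight)     # new temporal layer contributes nothing at first
        nn.init.zeros_(self.conv.bias)

    def forward(self, x):                    # x: (batch, channels, time)
        return x + self.conv(x)              # residual, so the layer is an exact identity at init

layer = TemporalInflation(64)
x = torch.randn(2, 64, 16)
assert torch.allclose(layer(x), x)           # frames pass through unchanged until training updates the weights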