Commit 44ef280d authored by Hamid Shojanazeri

Add flash attention and xformers memory-efficient attention through PyTorch SDPA

parent dd57dc5c
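The diff itself is not shown on this page, but the commit message refers to PyTorch's `scaled_dot_product_attention` (SDPA), which automatically dispatches to a flash-attention kernel or an xformers-style memory-efficient kernel when the hardware and dtype allow, falling back to the plain math implementation otherwise. A minimal sketch of how a model might call it (the tensor shapes here are illustrative, not taken from this repository):

```python
import torch
import torch.nn.functional as F

# Toy attention inputs with shape (batch, heads, seq_len, head_dim).
q = torch.randn(2, 4, 16, 32)
k = torch.randn(2, 4, 16, 32)
v = torch.randn(2, 4, 16, 32)

# SDPA picks the fastest available backend (flash attention,
# memory-efficient attention, or the math fallback) at runtime.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # same shape as q: (2, 4, 16, 32)
```

On CUDA devices, the choice of backend can additionally be constrained with PyTorch's SDPA kernel context managers; on CPU the math fallback is used, so the call above runs anywhere.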