Unverified commit 3f1fef7a authored by Geeta Chauhan, committed by GitHub

Add flash attention and xFormers memory-efficient attention through PyTorch SDPA (#97)

Parents: 4056a459 c3a11c4f
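
A minimal sketch of what routing attention through PyTorch's SDPA typically looks like; this is not the exact diff in this commit, and the tensor shapes, dtype, and `sdp_kernel` backend flags below are illustrative assumptions.

# Select the fused flash / memory-efficient SDPA backends (PyTorch 2.0+).
import torch
import torch.nn.functional as F

# Assumed layout: (batch, num_heads, seq_len, head_dim), half precision on GPU.
q = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 1024, 64, device="cuda", dtype=torch.float16)

# Allow only the flash and memory-efficient kernels, disabling the plain
# math fallback; PyTorch picks whichever fused kernel supports the inputs.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_mem_efficient=True, enable_math=False
):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)

Because SDPA dispatches to a backend at call time, a model can keep a single attention code path and still benefit from flash attention or the memory-efficient kernel wherever the input shapes and hardware allow it.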