Repository search results · repo:tspeterkim/flash-attention-minimal language:Cuda

tspeterkim/flash-attention-minimal

Flash Attention in ~100 lines of CUDA (forward pass only)
  • Cuda
  • 701 stars
  • Updated on Dec 30, 2024
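
The one-line description above refers to the forward pass of FlashAttention. As a rough illustration only, and not the repository's actual kernel, the sketch below shows the core idea that description points at: computing O = softmax(Q K^T / sqrt(d)) V with an online (streaming) softmax so the full N x N score matrix is never materialized. The one-thread-per-query-row layout, the d <= 128 limit, and all names here are assumptions made for this sketch.

#include <cuda_runtime.h>
#include <math.h>

// Hypothetical illustration, not the repository's kernel: one thread computes one
// output row of O = softmax(Q K^T / sqrt(d)) V using an online softmax, so the
// N x N score matrix is never stored. Q, K, V, O are row-major N x d; d <= 128 assumed.
__global__ void attention_forward(const float* Q, const float* K, const float* V,
                                  float* O, int N, int d) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // query row for this thread
    if (i >= N) return;

    const float scale = 1.0f / sqrtf((float)d);
    float m = -INFINITY;  // running max of scores, for numerical stability
    float l = 0.0f;       // running sum of exp(score - m)
    float acc[128];       // running (unnormalized) output row; assumes d <= 128
    for (int x = 0; x < d; x++) acc[x] = 0.0f;

    // Stream over key/value rows: each new score is folded in by rescaling the
    // previous accumulator, which is the trick FlashAttention builds on.
    for (int j = 0; j < N; j++) {
        float s = 0.0f;
        for (int x = 0; x < d; x++) s += Q[i * d + x] * K[j * d + x];
        s *= scale;

        float m_new = fmaxf(m, s);
        float rescale = expf(m - m_new);  // shrink old contributions to the new max
        float p = expf(s - m_new);        // weight of this key

        l = l * rescale + p;
        for (int x = 0; x < d; x++)
            acc[x] = acc[x] * rescale + p * V[j * d + x];
        m = m_new;
    }

    for (int x = 0; x < d; x++) O[i * d + x] = acc[x] / l;
}

A launch of this sketch would look like attention_forward<<<(N + 127) / 128, 128>>>(d_Q, d_K, d_V, d_O, N, d), with device buffers d_Q, d_K, d_V, d_O already allocated and filled (names assumed here). The FlashAttention algorithm additionally tiles this loop through on-chip shared memory; the streaming softmax shown above is the piece that avoids materializing the score matrix.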