Commit 63fee4f: fix copy-pasted block comment
Parent: a9592ca

1 file changed: +2 additions, -4 deletions

python/sglang/srt/layers/attention/hip_radix_attention.py

@@ -1,10 +1,8 @@
 from __future__ import annotations

 """
-Support different attention backends.
-Now there are two backends: FlashInfer and Triton.
-FlashInfer is faster and Triton is easier to customize.
-Each backend supports two operators: extend (i.e. prefill with cached prefix) and decode.
+HiP Attention Backend for SGLang
+https://arxiv.org/pdf/2406.09827
 """

 import logging
