
Commit a5e0672

Update README to add link to Mixtral MoE folder
1 parent: 89b2502

File tree

1 file changed (+2 lines, -0 lines)


README.md

Lines changed: 2 additions & 0 deletions
@@ -14,6 +14,8 @@ This is *NOT* intended to be a "framework" or "library" - it is intended to show
 
 For an in-depth walkthrough of what's in this codebase, see this [blog post](https://pytorch.org/blog/accelerating-generative-ai-2/).
 
+We supported [Mixtral 8x7B](https://mistral.ai/news/mixtral-of-experts/) which is a high-quality sparse mixture of experts (MoE) model, see [this page](./mixtral-moe) for more details.
+
 ## Community
 
 Projects inspired by gpt-fast in the community:
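The added README line only links to the Mixtral MoE folder; as a rough illustration of what a "sparse mixture of experts" layer does, here is a minimal PyTorch sketch of top-2 expert routing. This is not the implementation in ./mixtral-moe or anywhere else in gpt-fast: the class, layer sizes, and parameter names (TinySparseMoE, num_experts, top_k) are made up for the example, and it assumes only the general Mixtral-style scheme in which a router scores experts per token and only a few experts run for each token.

```python
# Hedged sketch of a sparse mixture-of-experts layer (top-2 routing).
# Not the gpt-fast / mixtral-moe code; all names here are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinySparseMoE(nn.Module):
    def __init__(self, dim: int, hidden: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router: a single linear layer that scores every expert for each token.
        self.gate = nn.Linear(dim, num_experts, bias=False)
        # Experts: independent feed-forward networks; only top_k run per token.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_tokens, dim)
        scores = self.gate(x)                                   # (tokens, experts)
        weights, chosen = torch.topk(scores, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)                    # renormalize over chosen experts
        out = torch.zeros_like(x)
        # Simple per-expert loop for clarity; real implementations batch/fuse this.
        for e, expert in enumerate(self.experts):
            token_idx, slot = torch.where(chosen == e)
            if token_idx.numel() == 0:
                continue
            out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
        return out


if __name__ == "__main__":
    moe = TinySparseMoE(dim=64, hidden=128)
    tokens = torch.randn(10, 64)
    print(moe(tokens).shape)  # torch.Size([10, 64])
```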
