Replies: 1 comment
It's been quite some time since this question was posted, but I think the document was ultimately published under another title: Arctic Long Sequence Training (ALST) for HF Transformers integration
It looks like a document talking about this got published in early June 2025, but it is no longer there:
Ulysses Plus enables even longer sequence lengths using a bag of tricks: Activation checkpoint offload to CPU; Tiled MLP compute; Liger-kernel ...
https://www.deepspeed.ai/tutorials/ulysses-plus-sequence-pallellism
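For context on one of the tricks named in that blurb: "tiled MLP compute" presumably means running the transformer feed-forward block over the sequence dimension in tiles, so the large intermediate activation (sequence length × expanded hidden size) is never materialized all at once. Below is a minimal NumPy sketch of that idea, not the actual DeepSpeed implementation; all function names and shapes here are illustrative.

```python
import numpy as np

def mlp(x, w1, w2):
    # Plain feed-forward block: expand, nonlinearity, contract.
    # The intermediate h has shape (seq_len, 4 * d) -- the memory hog.
    h = np.maximum(x @ w1, 0.0)  # ReLU for simplicity
    return h @ w2

def tiled_mlp(x, w1, w2, tile=4):
    # Same result, but the sequence dimension is processed in tiles,
    # so the intermediate activation is only (tile, 4 * d) at a time.
    return np.concatenate(
        [mlp(x[i:i + tile], w1, w2) for i in range(0, x.shape[0], tile)],
        axis=0,
    )

rng = np.random.default_rng(0)
seq_len, d = 16, 8
x = rng.standard_normal((seq_len, d))
w1 = rng.standard_normal((d, 4 * d))
w2 = rng.standard_normal((4 * d, d))

# The tiled version is numerically identical to the full computation,
# because the MLP is applied independently to each sequence position.
assert np.allclose(mlp(x, w1, w2), tiled_mlp(x, w1, w2))
```

The trick works because the MLP has no cross-position interaction, so tiling the sequence dimension changes only peak memory, not the result.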
It sounds like a great idea, but I wonder whether this feature will be available sometime soon or if it has been cancelled.
Thanks!