Update on "[CP] Enable FlexCP for llama3"
Summary:

Continuing the previous PR, this PR enables FlexAttention + context parallelism (CP) for llama3. FlexCP will use PTRRLoadBalancer.

Note that this PR requires pytorch/pytorch#170201

[ghstack-poisoned]
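The PR does not show the load balancer's internals, but the idea behind a round-robin CP load balancer can be sketched. Under causal attention, later tokens attend to more keys, so a contiguous split of the sequence across CP ranks leaves later ranks with far more work. A common fix (the "zigzag" scheme used by several CP implementations) splits the sequence into `2 * cp_degree` chunks and pairs an early chunk with a late chunk on each rank. The sketch below is illustrative only and is not the actual `PTRRLoadBalancer` implementation; the function name and layout are assumptions.

```python
def round_robin_shards(seq_len: int, cp_degree: int) -> list[list[range]]:
    """Illustrative round-robin (zigzag) sequence sharding for causal CP.

    Splits [0, seq_len) into 2 * cp_degree equal chunks and assigns
    chunk i together with chunk (2 * cp_degree - 1 - i) to rank i, so
    every rank gets one "cheap" early chunk and one "expensive" late
    chunk, balancing causal-attention FLOPs across ranks.
    """
    assert seq_len % (2 * cp_degree) == 0, "seq_len must divide evenly"
    chunk = seq_len // (2 * cp_degree)
    shards = []
    for rank in range(cp_degree):
        lo = rank                      # early (cheap) chunk index
        hi = 2 * cp_degree - 1 - rank  # late (expensive) chunk index
        shards.append([
            range(lo * chunk, (lo + 1) * chunk),
            range(hi * chunk, (hi + 1) * chunk),
        ])
    return shards


# Example: 8 tokens on 2 CP ranks.
# Rank 0 holds tokens [0, 2) and [6, 8); rank 1 holds [2, 4) and [4, 6).
print(round_robin_shards(8, 2))
```

Every token index is assigned to exactly one rank, and the sum of attended key positions is equal across ranks, which is what makes this assignment balanced for causal attention.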
fegin committed Dec 15, 2025
commit c4f3886f6d92cd16239530bbee81e3a5427e0a72

This merge commit was added into this branch cleanly.
