[Executorch][llama] Allow custom sdpa op replacement pass to leverage attention mask #10341

Merged: 1 commit merged into main on Apr 21, 2025

Conversation

pytorchbot (Collaborator)

This PR was created by the merge bot to help merge the original PR into the main branch.
ghstack PR number: #10285 by @kimishpatel
^ Please use this as the source of truth for the PR details, comments, and reviews
ghstack PR base: https://github.com/pytorch/executorch/tree/gh/kimishpatel/183/base
ghstack PR head: https://github.com/pytorch/executorch/tree/gh/kimishpatel/183/head
Merge bot PR base: https://github.com/pytorch/executorch/tree/gh/kimishpatel/182/orig
Merge bot PR head: https://github.com/pytorch/executorch/tree/gh/kimishpatel/183/orig
@diff-train-skip-merge


pytorch-bot bot commented Apr 21, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10341

Note: Links to docs will display an error until the docs builds have been completed.

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label on Apr 21, 2025 (this label is managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed).
@kirklandsign force-pushed the gh/kimishpatel/182/orig branch from e372822 to 704898c on April 21, 2025 23:11
Base automatically changed from gh/kimishpatel/182/orig to main on April 21, 2025 23:11
[Executorch][llama] Allow custom sdpa op replacement pass to leverage attention mask

Pull Request resolved: #10285

Previously we assumed that the custom sdpa op always performs causal attention.
This diff adds an option to the module-swap pass so the custom sdpa can use an
explicit attention mask instead of assuming causal attention.
ghstack-source-id: 279292324
@exported-using-ghexport

Differential Revision: [D73222736](https://our.internmc.facebook.com/intern/diff/D73222736/)
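
For context, a minimal sketch of what such a module-swap pass might look like. All names here (`SDPA`, `CustomSDPA`, `replace_sdpa`) are illustrative stand-ins, not the actual ExecuTorch pass or the real custom sdpa op:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SDPA(nn.Module):
    """Toy stand-in for the eager attention module being swapped out."""

    def forward(self, q, k, v, mask=None):
        # Old behavior: the mask is ignored, attention is always causal.
        return F.scaled_dot_product_attention(q, k, v, is_causal=True)


class CustomSDPA(nn.Module):
    """Stand-in for the custom sdpa wrapper installed by the pass."""

    def __init__(self, use_attention_mask: bool = False):
        super().__init__()
        self.use_attention_mask = use_attention_mask

    def forward(self, q, k, v, mask=None):
        if self.use_attention_mask:
            # New option: honor the caller-provided attention mask.
            return F.scaled_dot_product_attention(q, k, v, attn_mask=mask)
        # Default: keep the previous always-causal behavior.
        return F.scaled_dot_product_attention(q, k, v, is_causal=True)


def replace_sdpa(module: nn.Module, use_attention_mask: bool = False) -> nn.Module:
    """Module-swap pass: recursively replace SDPA submodules."""
    for name, child in module.named_children():
        if isinstance(child, SDPA):
            setattr(module, name, CustomSDPA(use_attention_mask=use_attention_mask))
        else:
            replace_sdpa(child, use_attention_mask=use_attention_mask)
    return module


# Usage (hypothetical): model = replace_sdpa(model, use_attention_mask=True)
```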
@kirklandsign force-pushed the gh/kimishpatel/183/orig branch from 1213ca6 to 6908d14 on April 21, 2025 23:11
@kirklandsign merged commit 8eebbcd into main on Apr 21, 2025
9 of 11 checks passed
@kirklandsign deleted the gh/kimishpatel/183/orig branch on April 21, 2025 23:12
@kirklandsign added the release notes: examples label (changes to any of our example LLM integrations, such as Llama3 and Llava) on Apr 21, 2025
keyprocedure pushed a commit to keyprocedure/executorch that referenced this pull request on Apr 21, 2025:
[Executorch][llama] Allow custom sdpa op replacement pass to leverage attention mask (pytorch#10341)
