[Executorch][llama] bug fix for custom sdpa for attention bias #10284
Conversation
When using attention bias, don't override the sequence length for causal attention. Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/) [ghstack-poisoned]
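For context, a minimal sketch of the behavior this fix describes; the wrapper name `custom_sdpa`, the `start_pos` parameter, and the mask construction are illustrative assumptions, not the actual ExecuTorch custom op. The point is that when the caller supplies an explicit attention bias, it is used as-is and the causal-mask sequence length is never overridden:

```python
# Hedged sketch, assuming a custom_sdpa wrapper around PyTorch SDPA;
# names and shapes are illustrative, not the real ExecuTorch kernel.
import torch
import torch.nn.functional as F

def custom_sdpa(q, k, v, start_pos, attn_mask=None):
    if attn_mask is not None:
        # Explicit attention bias: honor it verbatim and skip the causal
        # path, so the sequence length is not overridden (the bug fixed here).
        return F.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)
    # Causal decode path: derive the mask from the KV-cache position, so
    # row i may attend to cache positions j <= start_pos + i.
    seq_len = q.size(-2)
    kv_len = start_pos + seq_len
    causal = torch.tril(
        torch.ones(seq_len, kv_len, dtype=torch.bool), diagonal=start_pos
    )
    return F.scaled_dot_product_attention(q, k, v, attn_mask=causal)
```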
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10284
Note: Links to docs will display an error until the docs builds have been completed.
❗ 1 Active SEV — there is 1 currently active SEV. If your PR is affected, please view it below.
❌ 1 New Failure, 1 Unrelated Failure — as of commit 6af8457 with merge base 06f912d:
NEW FAILURE — the following job has failed.
BROKEN TRUNK — the following job failed but was also present on the merge base. 👉 Rebase onto the `viable/strict` branch to avoid these failures.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
This pull request was exported from Phabricator. Differential Revision: D73222733
…bias" When using attention bias dont override seq length for causal attention Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/) [ghstack-poisoned]
This pull request was exported from Phabricator. Differential Revision: D73222733 |
…bias" When using attention bias dont override seq length for causal attention Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/) [ghstack-poisoned]
This pull request was exported from Phabricator. Differential Revision: D73222733 |
Merged commit 45b5fe7 into gh/kimishpatel/182/base
Pull Request resolved: #10284 When using attention bias, don't override the sequence length for causal attention. ghstack-source-id: 279292323 exported-using-ghexport Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/)
Stack from ghstack (oldest at bottom):
When using attention bias, don't override the sequence length for causal attention.
Differential Revision: D73222733