
[Executorch][llama] bug fix for custom sdpa for attention bias #10284


Merged: 3 commits from gh/kimishpatel/182/head into gh/kimishpatel/182/base on Apr 21, 2025

Conversation

When using attention bias, don't override the sequence length for causal attention.

Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/)

[ghstack-poisoned]
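
To make the one-line description concrete: in the example's custom SDPA path, the causal branch derives an effective sequence length from the KV-cache position, and per this fix that override must not be applied when the caller passes an explicit attention bias. Below is a minimal sketch of that dispatch in plain PyTorch; `custom_sdpa`, `start_pos`, and the mask construction are illustrative assumptions for this sketch, not the actual ExecuTorch operator or its signature.

```python
import torch
import torch.nn.functional as F

def custom_sdpa(q, k, v, start_pos, attn_mask=None):
    # Illustrative stand-in for the custom SDPA wrapper; not the real op.
    # q: [batch, heads, seq_len, head_dim]; k, v hold the full KV cache.
    seq_len = q.size(2)
    if attn_mask is not None:
        # Attention-bias path: use the caller's bias as-is. The fix is to
        # NOT override the sequence length here; that logic belongs only
        # to the causal path below.
        return F.scaled_dot_product_attention(q, k, v, attn_mask=attn_mask)
    # Causal path: only cache positions up to start_pos + seq_len are valid.
    effective_len = start_pos + seq_len
    k = k[:, :, :effective_len]
    v = v[:, :, :effective_len]
    # Token i sits at absolute position start_pos + i and may attend to
    # all cache positions <= start_pos + i (boolean mask, True = attend).
    pos = start_pos + torch.arange(seq_len, device=q.device).unsqueeze(1)
    cache_pos = torch.arange(effective_len, device=q.device).unsqueeze(0)
    return F.scaled_dot_product_attention(q, k, v, attn_mask=cache_pos <= pos)
```

With a bias present, masking is already fully specified by the caller, so recomputing a causal window in that branch could silently mask out (or expose) the wrong positions.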
pytorch-bot commented Apr 17, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/10284

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 1 New Failure, 1 Unrelated Failure

As of commit 6af8457 with merge base 06f912d:

NEW FAILURE - The following job has failed:

BROKEN TRUNK - The following job failed but was present on the merge base:

👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot (Contributor) commented:

This pull request was exported from Phabricator. Differential Revision: D73222733

@kimishpatel added the release notes: examples label (Changes to any of our example LLMs integrations, such as Llama3 and Llava) on Apr 18, 2025
…bias"

When using attention bias dont override seq length for causal attention

Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/)

[ghstack-poisoned]
@facebook-github-bot
Copy link
Contributor

This pull request was exported from Phabricator. Differential Revision: D73222733

…bias"

When using attention bias dont override seq length for causal attention

Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/)

[ghstack-poisoned]
@facebook-github-bot
Copy link
Contributor

This pull request was exported from Phabricator. Differential Revision: D73222733

@facebook-github-bot merged commit 45b5fe7 into gh/kimishpatel/182/base on Apr 21, 2025
151 of 161 checks passed
@facebook-github-bot deleted the gh/kimishpatel/182/head branch on April 21, 2025 at 23:07
kirklandsign pushed a commit that referenced this pull request Apr 21, 2025
Pull Request resolved: #10284

When using attention bias, don't override the sequence length for causal attention.
ghstack-source-id: 279292323
@exported-using-ghexport

Differential Revision: [D73222733](https://our.internmc.facebook.com/intern/diff/D73222733/)
Labels

- CLA Signed (managed by the Facebook bot; authors need to sign the CLA before a PR can be reviewed)
- fb-exported
- release notes: examples (Changes to any of our example LLMs integrations, such as Llama3 and Llava)