
⬆️ Update flash-attn requirement from ~=2.6.3 to >=2.6.3,<2.8.0 #93

Open · wants to merge 1 commit into base: develop
Conversation

dependabot[bot] (Contributor) commented on behalf of github on Dec 2, 2024

Updates the requirements on flash-attn to permit the latest version.
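
The practical effect of the change is to widen the allowed version range: the old compatible-release pin `~=2.6.3` only permitted 2.6.x releases, while the new specifier `>=2.6.3,<2.8.0` also admits the 2.7 series (including v2.7.0.post2). As a quick illustration (not part of this PR), a short Python snippet using the `packaging` library shows which versions each specifier accepts:

```python
# Illustration only; not part of this PR or the repository's code.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Old pin: compatible-release operator, equivalent to >=2.6.3, ==2.6.*
old_spec = SpecifierSet("~=2.6.3")
# New pin from this PR: anything from 2.6.3 up to (but excluding) 2.8.0
new_spec = SpecifierSet(">=2.6.3,<2.8.0")

for v in ["2.6.3", "2.6.4", "2.7.0", "2.7.0.post2", "2.8.0"]:
    version = Version(v)
    print(f"{v}: old={version in old_spec}, new={version in new_spec}")

# Expected output:
# 2.6.3: old=True, new=True
# 2.6.4: old=True, new=True
# 2.7.0: old=False, new=True
# 2.7.0.post2: old=False, new=True
# 2.8.0: old=False, new=False
```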

Commits
  • 641db75 [CI] Pytorch 2.5.1 does not support python 3.8
  • 7435839 Update README for FA3
  • 241c682 [CI] Switch back to CUDA 12.4
  • c555642 Bump to v2.7.0
  • 6ffeb57 [CI] Still use CUDA 12.3 but pull the right pytorch version
  • 42f2b8b Use CUDA 12.4 in the build system (#1326)
  • 2f6c633 Drop support for Pytorch 2.0
  • 88d1657 [AMD ROCm] Fix KVcache bug and improve performance (#1328)
  • 284e2c6 Make FA3 paged attention ready for upgrade to Cutlass 3.6 (#1331)
  • b443207 Paged Attention support for FA3 (#1268)
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Updates the requirements on [flash-attn](https://github.com/Dao-AILab/flash-attention) to permit the latest version.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](Dao-AILab/flash-attention@v2.6.3...v2.7.0.post2)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] added the dependencies (Pull requests that update a dependency file) and python (Pull requests that update Python code) labels on Dec 2, 2024