
Please Publicize #6

Closed
ehartford opened this issue Nov 26, 2024 · 8 comments

Comments

@ehartford

Can you please publicize this and make it consumable?

If you do not, then nobody can really use it (except researchers who stumble upon it, like me).

It needs to either:

  1. be merged into the main FlashAttention repo, or
  2. be given a unique name (perhaps FlexHeadFA or something) and added to PyPI so that we can `pip install flexheadfa`

This could be a big contribution to open-source AI development, if you make it accessible.

Thank you,
Eric Hartford

@dingo-actual

☝️

@xiayuqing0622
Owner

@ehartford Thank you for your suggestion, Eric!

We actually created a PR to the main FlashAttention repository several months ago, with the intention of merging it there. However, since the PR has been stalled for quite some time, giving it a unique name and publishing it to PyPI might indeed be a better approach.

We’ll work on this and make it accessible soon. Thank you for highlighting its potential contribution to the open-source AI community!

@ehartford
Author

ehartford commented Nov 27, 2024

Can you please reference the PR to FlashAttention? I will try to bring attention to it.

Is it this one? Dao-AILab#1166

@xiayuqing0622
Owner

> Can you please reference the PR to FlashAttention? I will try to bring attention to it.
>
> Is it this one? Dao-AILab#1166

Yes, and yesterday we created a related issue to ask about this: Dao-AILab#1358

@ehartford
Author

Thankfully, he made an explicit decision.

@xiayuqing0622
Owner

xiayuqing0622 commented Nov 28, 2024

We’ll work on publishing it to PyPI then. :)

@xiayuqing0622
Owner

Hi @ehartford, we've just published it to PyPI! You can install it using `pip install flex-head-fa --no-build-isolation` (we went with the name you suggested :)). Please note that this version is not yet synced with the latest main FlashAttention repository. We'll work on updating it soon.

Thank you for your suggestion.
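For readers who want to verify the install, here is a minimal smoke test. It is a sketch that assumes the package keeps the upstream FlashAttention module name (`flash_attn`) and the `flash_attn_func` signature, and that the fork's headline feature is supporting unequal QK/V head dimensions; the shapes and head dims below are illustrative only.

```python
# Minimal smoke test: a sketch assuming flex-head-fa installs the
# upstream `flash_attn` module with the usual flash_attn_func signature.
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads = 2, 1024, 8
headdim_qk, headdim_v = 192, 128  # unequal QK/V head dims (illustrative values)

# flash_attn_func expects (batch, seqlen, nheads, headdim) fp16/bf16 CUDA tensors.
q = torch.randn(batch, seqlen, nheads, headdim_qk, device="cuda", dtype=torch.float16)
k = torch.randn(batch, seqlen, nheads, headdim_qk, device="cuda", dtype=torch.float16)
v = torch.randn(batch, seqlen, nheads, headdim_v, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # expected: (2, 1024, 8, 128), i.e. the output takes V's head dim
```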

@ehartford
Author

Thank you! Looking forward to using it!
