forked from vllm-project/vllm
Pull requests: HabanaAI/vllm-fork
- Fix for random sampler recompilations for incomplete batches (#663, opened Dec 30, 2024 by mfylcek)
- Fix: selecting correct backend for MultiHeadAttention (#645, opened Dec 18, 2024 by adobrzyniewicz-habana) [label: habana, Issues or PRs submitted by Habana Labs]
- Fix model OOM issue in llama-405 and mixtral, 2nd attempt (#644, opened Dec 18, 2024 by afierka-intel) [label: habana]
- Multimodality fix for llava (#641, opened Dec 17, 2024 by adobrzyniewicz-habana) [label: habana]
- [WIP] Add HPU support to vLLM v1 - cont. (#609, opened Dec 10, 2024 by kzawora-intel; 21 of 23 tasks complete)
- Bump aiohttp from 3.10.10 to 3.10.11 (#536, opened Nov 21, 2024 by dependabot[bot]) [label: dependencies, Pull requests that update a dependency file]