Any way to make chat async? #39
Comments
Hi, that is a great idea! But I am not sure if it is possible in Vim. Do you have any clue how to do that, or do you know of any plugin that does something similar?
There is a library called denops that lets you implement Vim plugins with Deno. https://github.com/vim-denops/denops.vim Here is an example of a plugin that runs the ChatGPT API asynchronously using denops (about 130 lines of code). https://github.com/tani/hey.vim Feel free to use it as a reference if you'd like.
Wow, vim-denops looks really cool! It seems they manipulate the Vim buffer from a different process, which makes all processing asynchronous. The only problem I can see with this approach is that it would probably mess up the undo tree after completion (the user would need to hit undo many times to undo a single completion). Anyway, at this point I don't want to introduce any third-party dependencies into vim-ai.
I attempted to try ChatGPT's web-browsing capabilities by asking GPT-4 how it would make this repository (I gave it a link) async. Although it didn't actually browse or do anything with the link, this is what it said: Prompt: https://github.com/madox2/vim-ai It'd be nice if it was asynchronous -- what do you think would be the easiest way to do that without introducing any third-party plugin dependencies? Response:
In the case of the Vim plugin you linked to, the easiest way to make it asynchronous would probably be to use job control if it's targeted at Neovim, or channels if it's targeted at Vim 8. This way, the OpenAI API call could be executed in a separate process, and the result could be processed when it's done. This would avoid blocking the editor while waiting for the API response.
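The core idea ChatGPT describes (run the API call off the main loop, collect the result when it's ready) can be sketched in plain Python, which is what vim-ai's backend already uses. This is a minimal illustrative sketch, not vim-ai's actual code: `slow_api_call` is a hypothetical stand-in for the real OpenAI request.

```python
# Minimal sketch of the non-blocking pattern: the slow request runs in a
# worker thread; the caller (here standing in for the editor loop) keeps
# doing other work and only collects the result once it is ready.
from concurrent.futures import ThreadPoolExecutor
import time

def slow_api_call(prompt):
    """Hypothetical stand-in for a blocking OpenAI API request."""
    time.sleep(0.2)  # simulate network latency
    return f"echo: {prompt}"

executor = ThreadPoolExecutor(max_workers=1)
future = executor.submit(slow_api_call, "hello")

# The "editor" stays responsive while the call is in flight.
while not future.done():
    time.sleep(0.01)  # stand-in for handling keystrokes, redraws, etc.

print(future.result())  # -> echo: hello
```

In Vim 8 terms, the worker thread plays the role of a job, and the `done()` polling loop plays the role of a channel callback firing when output arrives.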
Maybe this could be useful? https://github.com/prabirshrestha/async.vim
Looks cool, but this plugin does not execute external commands like bash; it rather uses the built-in Python.
Is there a way to have this feature implemented? Although this plugin is really cool, it loses to the web version every day. Features you have while using the web version:
Features you have while using the plugin:
I would argue that the Vim features come up really short in comparison with the web version.
I disagree; I use this (along with the .aichat files) almost 100% of the time. Cost is a non-issue: yes, OpenAI can subsidize it and offer a free version on their site because they developed the models, but how is @madox2 supposed to make it free in this plugin? Also, it's pretty cheap, especially with the new GPT-4 Turbo. Agreed that async would be nice, and it seems like it should be possible, though.
The web version also allows for that. I do like that this plugin offers it as well.
I'm not saying this plugin should have a workaround for the OpenAI pricing; let's not jump to conclusions... Given the fact that you are stuck waiting for the job to finish, it's no different from using the web version, and the web version is free (for now). With that, the original quoted statement still remains true.
But I'll rephrase it to better convey what I meant.
Although it is not an ideal solution, you could chat in a new terminal within Vim; that would make it async (e.g. spawn a new Vim instance in the split).
Regarding could also be used by vim-ai to open a job channel?
That looks promising and worth experimenting with. I'm just wondering how it would update the chat buffer without interrupting the user doing different actions, and whether it could keep consistent undo sequences.
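The "update the buffer without interrupting the user" question above usually comes down to a hand-off: the background worker never touches the buffer itself, it only pushes chunks onto a thread-safe queue, and the main thread (in Vim, typically a timer callback) drains that queue and applies the text. A minimal sketch, with illustrative names (`produce_chunks`, `drain`) that are not vim-ai's API:

```python
# The worker thread only writes to a queue; the "main thread" side
# drains it and mutates the (simulated) buffer. In Vim this drain step
# would run inside a timer callback, so only one side touches the buffer.
import queue
import threading

chunks = queue.Queue()

def produce_chunks():
    """Background worker: simulates a streamed API response."""
    for part in ["Hello", ", ", "world"]:
        chunks.put(part)
    chunks.put(None)  # sentinel: stream finished

def drain(buffer_lines):
    """Main-thread side: appends queued chunks to the last buffer line."""
    while True:
        part = chunks.get()
        if part is None:
            return
        buffer_lines[-1] += part

worker = threading.Thread(target=produce_chunks)
worker.start()
buf = [""]          # simulated chat buffer
drain(buf)
worker.join()
print(buf)          # -> ['Hello, world']
```

Because only the main thread ever writes to the buffer, the user's own edits and the streamed output never race against each other.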
I don't know if this could be helpful: https://github.com/tpope/vim-dispatch. They seem to have async stuff running in the background. For LLMs it is maybe more complex, as it impacts the buffer.
I'd suppose that the main difficulty is that the output is continuously written into the buffer. Apparently
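The undo-tree concern raised earlier in the thread suggests one possible trade-off: accumulate the streamed chunks off to the side and apply the whole completion as a single buffer edit, so one `u` reverts it. This is a toy sketch of that idea only; `apply_edit` and the snapshot-based undo stack are hypothetical stand-ins, not how Vim's undo actually works internally.

```python
# Toy model: each call to apply_edit creates exactly one undo entry.
# Streaming chunk-by-chunk through it would create one entry per chunk;
# joining the chunks first yields a single entry for the whole reply.
undo_stack = []

def apply_edit(document, text):
    """Hypothetical atomic buffer write; one call = one undo step."""
    undo_stack.append(list(document))  # snapshot for undo
    document.append(text)

streamed = ["The ", "answer ", "is ", "42."]
completion = "".join(streamed)  # accumulate instead of applying per chunk

doc = ["user: question"]
apply_edit(doc, completion)     # one edit -> one undo entry

print(doc)              # -> ['user: question', 'The answer is 42.']
print(len(undo_stack))  # -> 1
```

The cost of this approach is that the user no longer sees the answer appear incrementally, which is exactly the tension this comment points at.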
Great plugin, thanks. I find myself mostly using it via chat, and sometimes waiting around while it types a long answer. I know neovim/vim 8 have async features -- would this be possible/hard to build in?