WebGUI Chat: We now ship a chat UI with our Docker image!
Embeddings: Our Docker image now starts the embeddings model by default, which is required for VecDB support.
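To illustrate why embeddings matter for VecDB: a vector database retrieves code or documents by comparing embedding vectors, typically via cosine similarity. The sketch below is purely illustrative, not the actual Refact.ai implementation; the document names and vectors are made up.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def nearest(query, stored):
    """Return the stored (name, embedding) pair most similar to the query."""
    return max(stored, key=lambda item: cosine_similarity(query, item[1]))

# Toy example: two stored embeddings, one query vector.
docs = [("doc_a", [1.0, 0.0]), ("doc_b", [0.0, 1.0])]
best = nearest([0.9, 0.1], docs)
```

In a real deployment, the embeddings model served from the Docker image produces these vectors for your code, and VecDB handles the storage and similarity search.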
Shared Memory Issue Resolved: A critical performance issue related to shared memory has been fixed. For more details, check out the GitHub issue.
Anthropic Integration: We've added the ability to supply API keys for using third-party models!
stable-code-3b: The list of available models is growing! This time, we added stabilityai/stable-code-3b!
Optional API Key for OSS: The Refact.ai Self-hosted version can now use an optional API key for security when deployed in the cloud.
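A common pattern for such optional keys is that clients send them as a bearer token only when one is configured. The sketch below is a hypothetical client-side illustration; the environment variable name `REFACT_API_KEY` and the header scheme are assumptions, not documented Refact.ai behavior.

```python
import os

def build_headers(api_key=None):
    """Build request headers, adding Authorization only when a key is set.

    Hypothetical helper: the actual self-hosted server's auth scheme may
    differ; this only shows the usual optional bearer-token pattern.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

# Reads the key from the environment if present; otherwise requests go
# out unauthenticated, matching an "optional" key setup.
headers = build_headers(os.environ.get("REFACT_API_KEY"))
```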
Build Information: In the settings, you can now find the About page, which lists the packages in use, their versions, and commit hashes.
LoRa Switch Fix: The issue where switching between LoRas didn't show information in the logs is now fixed!
vLLM Out-of-Memory (OOM) Fix: We've fixed an out-of-memory issue with vLLM in Refact.ai Enterprise!