ChatModel supplier for google vertex model #1198
Unanswered
vadimlevitzky asked this question in Q&A
Replies: 2 comments
-
What is not working?
-
I implemented a chat model supplier class:

```java
public class MyChatModelSupplier implements Supplier<ChatLanguageModel> {

    /**
     * 1. Enable Vertex AI in the Google Cloud Console
     * 2. Set your own project and location below
     */
    private static final String PROJECT = "**************************";
    private static final String LOCATION = "*************************";
    private static final String MODEL_NAME = "gemini-1.5-pro-002";

    @Override
    public ChatLanguageModel get() {
        /*
        return OpenAiChatModel.builder()
                .apiKey("SOME_KEY")
                .modelName("gpt-4o")
                .logRequests(true)
                .logResponses(true)
                .build();
        */
        return VertexAiGeminiChatLanguageModel.builder()
                .projectId(PROJECT)
                .location(LOCATION)
                .modelId(MODEL_NAME)
                .logRequests(true)
                .logResponses(true)
                .build();
    }
}
```

The commented-out OpenAI version works, i.e. I get a response from the model.
-
Hi,
I'm a little confused: there is a VertexAiGeminiChatModel in langchain4j 1.0.0-alpha1, while the langchain4j 0.22.0 extension for Quarkus has a differently named class, VertexAiGeminiChatLanguageModel.
When I use VertexAiGeminiChatLanguageModel in a Quarkus sample project, it does not work for me. I also checked with the OpenAI chat model, and that worked. I ran a main from the langchain4j samples using VertexAiGeminiChatModel, and that worked too. I don't understand what I'm missing.
I'm trying to implement a ChatModelSupplier and manually create the ChatLanguageModel using its builder method. I'm using the annotation @RegisterAiService(chatLanguageModelSupplier = MyChatModelSupplier.class); my supplier implementation works if I use an OpenAI chat model builder.
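For context, the wiring described above looks roughly like the following. This is a minimal sketch, not a verified setup: only @RegisterAiService and its chatLanguageModelSupplier attribute come from the question; the interface name, method name, and import paths are illustrative assumptions about a typical quarkus-langchain4j project.

```java
import io.quarkiverse.langchain4j.RegisterAiService;

// Hypothetical AI service interface wired to the custom supplier.
// Quarkus generates the implementation; each call is routed through
// the ChatLanguageModel returned by MyChatModelSupplier.get().
@RegisterAiService(chatLanguageModelSupplier = MyChatModelSupplier.class)
public interface MyAiService {

    // Illustrative method: sends the user message to the configured model
    // and returns the model's text response.
    String chat(String userMessage);
}
```

With this in place, injecting MyAiService into a bean and calling chat(...) should exercise whichever ChatLanguageModel the supplier builds, which is where the Vertex AI variant reportedly fails while the OpenAI variant works.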