From 05dc4430bd843c5b9407041b6ce01965261d12d5 Mon Sep 17 00:00:00 2001
From: tronikos
Date: Sat, 18 May 2024 04:22:08 -0700
Subject: [PATCH] Google Generative AI: Add a clarification on the google_generative_ai_conversation.generate_content (#32827)

Co-authored-by: Franck Nijhof
---
 .../google_generative_ai_conversation.markdown | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/source/_integrations/google_generative_ai_conversation.markdown b/source/_integrations/google_generative_ai_conversation.markdown
index 71a3a217c3c..836f9efe9a0 100644
--- a/source/_integrations/google_generative_ai_conversation.markdown
+++ b/source/_integrations/google_generative_ai_conversation.markdown
@@ -52,6 +52,12 @@ Maximum Tokens to Return in Response:
 
 ### Service `google_generative_ai_conversation.generate_content`
 
+<div class='note'>
+
+This service isn't tied to any integration entry, so it won't use the model, prompt, or any of the other settings in your options. If you only want to pass text, you should use the `conversation.process` service.
+
+</div>
+
 Allows you to ask Gemini Pro or Gemini Pro Vision to generate content from a prompt consisting of text and optionally images. This service populates [response data](/docs/scripts/service-calls#use-templates-to-handle-response-data) with the generated content.
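For context on the section this patch touches, here is a minimal sketch of calling the service from a Home Assistant script. The `prompt` and `image_filename` fields and the `text` key in the response data follow the integration's documented service schema as I understand it; the script name and image path are placeholders to adapt.

```yaml
# Sketch: ask Gemini to describe a local image and surface the reply as a
# persistent notification. `prompt`, `image_filename`, and the `text` key in
# the response are assumed from the integration's service schema; the image
# path is a placeholder and must be readable by Home Assistant.
describe_snapshot:
  sequence:
    - service: google_generative_ai_conversation.generate_content
      data:
        prompt: "Briefly describe what you see in this image."
        image_filename:
          - "/config/www/snapshot.jpg"
      response_variable: generated_content
    - service: persistent_notification.create
      data:
        message: "{{ generated_content.text }}"
```

For text-only requests, the note added by this patch points to `conversation.process` instead, which runs through a configured conversation agent and therefore respects the model, prompt, and other options of that entry.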