Google's Gemini LLM models
Learn more about Google’s Gemini LLM models and how to access them via Google AI Studio and Google Cloud Platform
Google’s Gemini LLM models are known for strong coding ability and for handling complex tasks well (especially the Gemini 2.5 models).
What are they good for?
Software development agents, complex task execution.
How much do they cost?
Credits charged per 1,000 tokens processed.
What native files can they process?
Gemini models can natively process the following file types (a short example sketch follows the list):
- .jpg
- .jpeg
- .png
- .webp
- .gif
- .wav
- .mp3
- .aac
- .ogg
- .aiff
- .flac
- .mp4
- .mpeg
- .mov
- .avi
- .flv
- .mpg
- .webm
- .wmv
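For illustration, here is a minimal sketch of passing one of these file types to Gemini directly via Google’s google-generativeai Python SDK (outside Relevance AI); the file name and model name are placeholders:

```python
# A minimal sketch of sending one of the file types above to Gemini directly,
# using Google's google-generativeai Python SDK; "meeting.mp3" and the model
# name are illustrative placeholders.
import google.generativeai as genai

genai.configure(api_key="YOUR_GOOGLE_AI_STUDIO_API_KEY")

# Upload the file, then pass it alongside a text prompt.
audio_file = genai.upload_file("meeting.mp3")
model = genai.GenerativeModel("gemini-1.5-flash")
response = model.generate_content([audio_file, "Summarise this recording."])
print(response.text)
```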
Accessing Gemini models using Google AI Studio
If you use Google AI Studio, you can add your API key to Relevance AI, which will allow you to use your Google AI Studio credits as opposed to your Relevance AI credits.
You can add your Google AI Studio API key to Relevance AI by following these steps:
- Generate your API key at Google AI Studio
- Log into Relevance AI and head to ‘Integrations & App Keys’
- Search for ‘Google AI Studio Gemini API Key’ and add your API key
Once you’ve successfully added your API key, you will not be charged Relevance AI credits when you use Google Gemini models in your LLM Tool steps and Agents.
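If you want to confirm the key itself is valid, an optional check you can run locally (outside Relevance AI) is sketched below, assuming Google’s google-generativeai Python SDK:

```python
# Optional sanity check of a Google AI Studio API key.
# Replace the placeholder with the API key generated in Google AI Studio.
import google.generativeai as genai

genai.configure(api_key="YOUR_GOOGLE_AI_STUDIO_API_KEY")

# List the Gemini models this key can call; an authentication error here
# usually means the key was not copied correctly.
for model in genai.list_models():
    if "generateContent" in model.supported_generation_methods:
        print(model.name)
```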
Accessing Gemini models using Vertex AI on Google Cloud Platform
If you use Google Cloud Platform, you can connect your Vertex AI service account credentials to Relevance AI, which will allow you to use your Google Cloud Platform credits as opposed to your Relevance AI credits.
You can do this by following these steps:
Step 1: Enable the Vertex AI API
In the Google Cloud Console, enable the Vertex AI API by following the instructions here.
Step 2: Create a service account
Create a service account in the Google Cloud Console following the instructions at https://cloud.google.com/iam/docs/service-accounts-create. Make note of the project ID used to create the account. The service account can have any name and description desired.
Step 3: Add IAM role
Give the service account the Vertex AI Service Agent IAM role. This permits the service account to use Vertex AI and call LLMs in Google Cloud.
Step 4: Create a private key
Create a JSON private key for the service account, and download the key file.
Step 5: Extract the client_email and private_key fields
In the key file, copy the client_email and private_key values exactly as they are written, removing the surrounding quotation marks.
The client_email should be an email address that normally ends with iam.gserviceaccount.com.
The private_key should be formatted as follows:
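An abbreviated example of the expected shape (the actual key material is much longer):

```
-----BEGIN PRIVATE KEY-----
MIIEvQIBADANBgkqhkiG9w0BAQEFAASCBKcwggSj...
-----END PRIVATE KEY-----
```

If you prefer to pull the values out programmatically, a minimal Python sketch (the key file name below is a placeholder for the file you downloaded in step 4):

```python
import json

# "service-account-key.json" is a placeholder for the key file downloaded in step 4.
with open("service-account-key.json") as f:
    key = json.load(f)

# These are the values Relevance AI asks for in the next step.
print("Client Email:", key["client_email"])
print("Project ID:", key["project_id"])
# json.load turns the escaped "\n" sequences into real line breaks, so the
# private key prints as a multi-line PEM block without surrounding quotes.
print(key["private_key"])
```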
Step 6: Add details to Relevance AI
Log into Relevance AI and head to ‘Integrations & App Keys’. Search for ‘Google Cloud Platform’ and add the Client Email, the Private Key, and the Project ID you noted in step 2.
Step 7 (optional): Enter the region
If the Gemini models are deployed in a particular Google Cloud region, enter that region; otherwise, leave the field blank. If you enter a region where the models are not deployed, model calls will fail. See https://cloud.google.com/vertex-ai/generative-ai/docs/learn/locations for the list of regions and the expected format of the region string.
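Before adding everything to Relevance AI, you can optionally check that the service account, project ID and region work together. A minimal sketch using the Vertex AI Python SDK (the google-cloud-aiplatform package); the file path, project ID, region and model name below are placeholders:

```python
import os

import vertexai
from vertexai.generative_models import GenerativeModel

# Point Application Default Credentials at the service account key from step 4
# (placeholder path).
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "service-account-key.json"

# Use your project ID and, if you plan to set one, the same region you will
# enter in Relevance AI.
vertexai.init(project="my-project-id", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")
print(model.generate_content("Reply with OK if you can read this.").text)
```

If this call succeeds, the same Client Email, Private Key, Project ID and region should also work once added to Relevance AI.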
Once your Google Cloud Platform details are added correctly, you will not be charged Relevance AI credits for Google Gemini models.