Intermediate Level

INTRODUCTION:

Hi folks! Welcome to another exciting blog at Gamaka AI. This article deals with using Python for Gen AI in Google Colab. The advantage of Google Colab is that it lets you write and execute Python code directly in your browser without any setup or installation; it is a cloud-based Jupyter Notebook environment provided by Google. Here, we will use Gemini, a family of Gen AI models created by Google DeepMind known for its multimodality and seamless reasoning across text, code, audio and images. A dynamic, general-purpose programming language such as Python can then be leveraged with Gen AI, which has huge potential to generate excellent results.

Models and their supported inputs

– Gemini 1.5 Flash: audio, images, videos, and text
– Gemini 1.5 Pro: audio, images, videos, and text
– Gemini 1.0 Pro: text

Follow These 5 Simple Steps To Get Started With Gemini

In this article, we’ll be using gemini-1.5-flash-002 for text-to-text prompting. Text-to-text prompting means we ask Gemini a question and it generates output in text form.

Step 1

– Obtain an API key to use Gemini API:
APIs enable smooth data sharing between applications, and to use the Gemini API we first need an API key. Visit Google AI Studio, create an API key and copy it. For security, the key must be kept secret and should not be shared publicly.

Step 2

– Save the API key:
Open Google Colab and click on the Secrets tab.

Paste the copied API key into the Value field and assign a name in the Name field. Here, we have named it ‘Google_Studio’.
This is how we will interact with the API using Google Colab.
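
For reference, here is a minimal sketch of how the stored secret can be read from a notebook cell (assuming the secret is named ‘Google_Studio’ as above and notebook access is enabled for it):

```python
# Read the API key saved under the name 'Google_Studio' in the Secrets tab
from google.colab import userdata

GOOGLE_API_KEY = userdata.get('Google_Studio')
```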

Step 3

– To use Gemini, install the required libraries and set up the model:
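
A minimal sketch of this step, assuming the key was stored as ‘Google_Studio’ in Step 2:

```python
# Install the Gemini Python SDK (run once per Colab session)
!pip install -q -U google-generativeai

import google.generativeai as genai
from google.colab import userdata

# Configure the SDK with the API key saved in Colab Secrets
genai.configure(api_key=userdata.get('Google_Studio'))
```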

Step 4

– Check the list of models from Google AI Studio:
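
One way to do this from the notebook itself is to ask the API for the models it exposes, for example:

```python
# Print every available model that supports text generation
for m in genai.list_models():
    if 'generateContent' in m.supported_generation_methods:
        print(m.name)
```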

Step 5

– Model configuration using Gemini-1.5-flash-002:
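
A sketch of what this cell might look like (the model name and prompt follow the article; response.text holds the generated answer):

```python
# Instantiate the chosen model from the list above
model = genai.GenerativeModel('gemini-1.5-flash-002')

# Ask a simple factual question and print the generated text
response = model.generate_content("How many months are there in a year?")
print(response.text)
```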

Notice the model selected here from the available list of models: ‘gemini-1.5-flash-002’.
After that, we give the model a sample prompt, ‘How many months are there in a year?’, and it generates the correct factual answer, 12, since there are 12 months in a year.

So far, we have seen a single prompt and its response. What if we want to start a whole chat in which several questions are asked, each receiving its own response? It’s possible! Here’s how…
Let’s start a chat using Gen AI:
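
Here is a minimal sketch of a multi-turn chat session (the questions are illustrative, not the exact ones from the original screenshots):

```python
# Start a fresh chat session with an empty history
chat = model.start_chat(history=[])

# Ask questions one after another; the session keeps the conversation context
print(chat.send_message("What is Python?").text)
print(chat.send_message("List three of its popular libraries.").text)
```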

If you just want to review the conversation between the user and the model so far, use chat.history.
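
For example:

```python
# Inspect the raw conversation history (a list of Content objects)
print(chat.history)
```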

But the chat.history option does not show the output in a readable format. Therefore, we will use a for loop to make the output easier to understand.
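
A simple loop like the one below prints each turn with its role:

```python
# Print each message as "role: text" for a readable transcript
for message in chat.history:
    print(f"{message.role}: {message.parts[0].text}")
```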

Finally, it is important to note that the more specific and precise the prompt, the better the result. The following example substantiates this.
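
As a sketch of the idea (these two prompts are our own illustrative examples, not the ones from the original screenshots):

```python
# A vague prompt: the model has to guess what kind of answer we want
vague = model.generate_content("Tell me about Python.")

# A specific prompt: scope, audience, and length are spelled out
specific = model.generate_content(
    "In three bullet points, explain what makes Python popular "
    "for data science, for a reader who already knows Java."
)

print(vague.text)
print(specific.text)
```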

It’s all about giving precise prompts to get the most accurate output. However, one must keep in mind that Gemini AI is prone to the following limitations:
1. Bias
2. Hallucination
3. Inconsistency
In the evolving world of GenAI, it is imperative to keep prompts and outputs privacy-conscious, ethical, fair, rational, reliable, and free from bias.
More AI coding features, such as code completion and natural-language-to-code generation, are on the way in Colab. Colab will use Codey, a family of code models built on PaLM 2 and fine-tuned on a large dataset of high-quality, permissively licensed code from external sources to improve performance on coding tasks. These Codey versions have been customized especially for Python and for Colab-specific uses.
Natural-language-to-code generation can handle tasks like producing larger blocks of code or writing whole functions from comments or prompts. This reduces the need to write repetitive code, so you can focus on the more interesting parts of programming and data science.

As of now, Google Colab provides two amazing features. The first is “Generate with AI” in the code cell, which gives us a prompt box and a “Generate” button that lets us enter any text prompt to generate code.

The second feature is an integrated chatbot, Gemini, which makes getting help easier than ever. We can ask questions directly, like "How do I import data from Google Sheets?" or "How do I filter a Pandas DataFrame?"
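
For the DataFrame question, for instance, the assistant would typically suggest something along these lines (a generic pandas sketch, not Gemini’s literal output):

```python
import pandas as pd

# A small sample DataFrame
df = pd.DataFrame({"name": ["Asha", "Ravi", "Meera"], "score": [82, 67, 91]})

# Keep only the rows where the score is above 80
filtered = df[df["score"] > 80]
print(filtered)
```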

An auto-complete suggestions feature is also available in Google Colab, as shown in the image below.

Interesting, right? The Codey models inside Colab will help increase programming speed, quality, and comprehension. There are many more features and improvements coming that will make Colab an even more helpful, integrated experience for working with our data.