Say Goodbye To Single Commands: Gemini AI Can Now Perform Multi-Step Actions

Image Source: “P365x52-12: Control, Option, Command” by kurafire is licensed under CC BY 2.0. https://www.flickr.com/photos/62449696@N00/8375271656

Google just announced some cool new stuff for its Gemini AI at Samsung’s big phone launch event! Gemini is getting a major upgrade, especially for the latest Samsung phones (like the new S25). But don’t worry, some of these new features will also work on older Samsung S24 and Pixel 9 phones.

The biggest news is that Gemini can now do multiple things in a row. Imagine this: you ask Gemini to find restaurants near you using Google Maps, and then tell it to send a text to your friends inviting them to lunch, all without lifting a finger!

This new “chaining” ability will work on any device with Gemini, but it depends on whether developers have made the apps work with Gemini. Luckily, all the main Google apps already work, and even some Samsung apps like Calendar, Reminders, and Notes are ready to go!
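To picture how chaining works, here's a minimal sketch in Python. All of the names here (find_restaurants, send_text, run_chain) are invented for illustration; this is not Google's actual API, just a toy model of the idea that one action's output feeds the next.

```python
# Toy model of "chaining": the result of one app action becomes
# the input to the next, with no user taps in between.
# These functions are hypothetical stand-ins, not a real Gemini API.

def find_restaurants(query, location):
    # Stand-in for a Maps lookup; returns mock results.
    return [{"name": "Blue Door Cafe", "address": "12 Main St"}]

def send_text(recipients, message):
    # Stand-in for a Messages action; returns the composed text.
    return f"To {', '.join(recipients)}: {message}"

def run_chain(location, friends):
    # Step 1: look up nearby lunch spots.
    spots = find_restaurants("lunch", location)
    top = spots[0]
    # Step 2: feed the first result into a text invitation.
    return send_text(friends, f"Lunch at {top['name']} ({top['address']})?")

print(run_chain("downtown", ["Sam", "Priya"]))
```

The key point is the hand-off in `run_chain`: the Maps-style result is passed straight into the messaging step, which is exactly the "find restaurants, then text my friends" flow described above.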

Google’s Gemini is getting even more human-like! Gemini Live, the part that lets you have a conversation with the AI like you would with a friend, is getting a big upgrade, especially when it comes to understanding different kinds of information.

Now, you can show Gemini Live pictures, files, and even YouTube videos! Imagine asking Gemini, “Hey, can you check out this picture of my school project and give me some feedback?” and then actually showing it the picture. That’s what’s possible now.

Unfortunately, this fancy new upgrade only works on the Samsung Galaxy S24 and S25 and the Pixel 9 for now.

And there’s one more thing! Google is bringing something called “Project Astra” to phones in the next few months, starting with the Galaxy S25 and Pixel phones. This is a whole new kind of AI assistant that lets you interact with the world around you using your phone’s camera.

Picture this: you’re walking down the street and see a cool building. Just point your phone at it and ask Gemini, “What’s the history of this building?” or “What kind of architecture is this?” You can even ask things like, “When is the next bus coming?”

But it gets even cooler. Project Astra is designed to work with Google’s special AI glasses. Imagine wearing these glasses and just asking Gemini questions about what you see, without even having to take out your phone! It’s like having your own personal AI tour guide wherever you go.

Project Astra is still being developed, but it gives you a taste of what the future could be like. Imagine a world where you can get instant information about anything you see, just by asking. It’s like having the entire internet in your eyeballs!

It’s still early days, but Project Astra has the potential to change the way we learn, explore, and interact with the world around us. It’s a glimpse into a future where technology seamlessly blends with our everyday lives, making everything easier and more fun.
