At Google I/O 2024, Sundar Pichai introduced Project Astra, the company's new multimodal AI agent. The tool can answer users' questions in real time, and what makes it special is that it can hold a real-time conversation through text, audio, or video. Google showed a demo video of the new AI agent at its developer conference. Project Astra will compete directly with OpenAI's GPT-4o multimodal model.
How does Project Astra work?
What sets Google's new AI agent apart is that it is aware of everything in the room around it. If you give the tool camera access, it will describe whatever it sees. If you want to ask about something else, you can simply point the camera at an object in view and Astra will tell you about that too.
Google shared a demo video of the tool on its Google DeepMind X handle, in which Astra easily describes everything in the room. The tool can be accessed through a smartphone or smart glasses. Built on Google's Gemini AI, it is also integrated with Google Lens.
Google claims that Project Astra can process information rapidly. It continuously encodes video frames and surfaces information related to what it sees. Google has also made many changes to the new AI assistant so that its responses sound more natural.
It can even find lost items
Google's AI assistant is similar to OpenAI's GPT-4o: it, too, can answer users' questions about almost anything through pictures, video, and text, as the demo shared by Google DeepMind shows. However, it also offers some additional features.
With Project Astra, you can ask questions by drawing directly on the screen. It also remembers everything it has seen in the video feed, so it can answer questions about past frames whenever you ask. If you forget where you left something in the room and Project Astra captured it on camera, it can help you find that item.