Google’s Project Astra takes Gemini AI into the real world

Summary

  • Google Search adds video queries, but Project Astra’s AI eyewear offers more intuitive real-time information lookup.
  • Gemini AI on Astra streamlines everyday tasks, from interpreting hand-drawn references to generating band names.
  • Google’s potential wearable Project Astra combines smart glasses with advanced AI capabilities for real-life assistance.



For years, Google Search has been largely text-based, with utilities like reverse image search and, more recently, Google Lens adding a degree of convenience through visual searches. Although the company announced support for video queries in Search at its annual developer conference keynote today, Gemini AI has enabled a whole new way of looking up information in near real time with Project Astra.


Related

Google I/O 2024: Everything announced so far

It’s all about AI this year — here’s everything Google’s talked about so far

During the Google I/O keynote presentation, Google DeepMind founder Demis Hassabis spoke about the technical challenges of processing multimodal input for AI interactions in Gemini. The presentation then cut to two consecutive clips. The first featured real-time audiovisual prompts, with the presenter asking the Gemini app on their phone to identify objects, compose poetic rhymes, summarize on-screen code, and pinpoint a location from the view out of a window, like a GeoGuessr player.

However, the cooler bit was the shorter segment that followed, where the presenter donned a bulky pair of smart eyewear with cameras and Gemini access to make the mobile app experience more, well… mobile.




Project Astra is the Google Glass we deserve

Codenamed Project Astra, this eyewear looks like typical smart glasses, with a chunky design reminiscent of the Bose Frames but capabilities resembling those of Meta’s Ray-Ban smart glasses. Using the glasses, the presenter could direct the AI’s attention to whatever they were looking at far more intuitively, without juggling a secondary device.

In the demo, Gemini on Astra optimized a server flowchart by suggesting a cache, then interpreted a hand-drawn reference to the Schrödinger’s cat paradox, and finally picked a humorous name for a band comprising a golden retriever and his toy tiger.

“A universal AI agent that can be helpful in every aspect of life”

— Demis Hassabis


For what it’s worth, Google claims the video was not sped up and was recorded in a single take. Even if it had been, it’s impressive how much a pair of glasses can do with cameras, speakers, and Google’s latest tech advancements. Another remarkable element is the immediacy of the AI’s responses, which leads us to believe the glasses are paired with more powerful hardware, such as a phone, since onboard networking and processing would add considerable bulk to the design.

Although the company did not commit to a timeline for launching Project Astra as a tangible product, we are glad Google has not abandoned wearable computing after Google Glass failed to take off.
