Project Astra: Our vision for the future of AI assistants | @Google
Introducing Project Astra. We created a demo in which a tester interacts with a prototype AI agent built on our multimodal foundation model, Gemini.

There are two continuous takes: one with the prototype running on a Google Pixel phone and another on a prototype glasses device.

The agent takes in a constant stream of audio and video input. It can reason about its environment in real time and interact with the tester in a conversation about what it is seeing.
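
For readers curious how this kind of see-and-respond loop can be approximated with public tools, here is a minimal sketch, not Astra's actual implementation: it samples webcam frames and asks the Gemini API about each one. It assumes the google-generativeai Python SDK and OpenCV; the model name, the two-second sampling interval, and the prompt are illustrative choices, and unlike the agent in the demo it handles only video frames, not a continuous audio stream.

import time
import cv2                             # pip install opencv-python
import google.generativeai as genai    # pip install google-generativeai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")            # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")  # illustrative model choice

cap = cv2.VideoCapture(0)  # default webcam
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Convert OpenCV's BGR frame to an RGB PIL image the SDK accepts.
        image = Image.fromarray(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        response = model.generate_content(
            [image, "Briefly describe what you see in this frame."]
        )
        print(response.text)
        time.sleep(2)  # poll a frame every ~2 seconds, not a true real-time stream
finally:
    cap.release()

Polling single frames like this is a rough stand-in; the demo's agent reasons over a continuous multimodal stream rather than isolated snapshots.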

Learn more about Project Astra: https://goo.gle/3wAUwFh

#GoogleIO2024

Watch the full Google I/O 2024 keynote: youtube.com/live/XEzRZ35urlk?si=Yo3HTskWQMJwdf3G

To watch this keynote with American Sign Language (ASL) interpretation, please click here: youtube.com/live/6rP2rEWsfpM?si=RuVQFeK4ESnOsUnp

#GoogleIO

Subscribe to our Channel: youtube.com/google
Find us on X: twitter.com/google
Watch us on TikTok: tiktok.com/@google
Follow us on Instagram: instagram.com/google
Join us on Facebook: facebook.com/Google