Google Maps is getting smarter with the integration of Gemini, Google’s AI assistant, bringing a more intuitive and hands-free navigation experience to Android and iOS users. The new update enables drivers to interact with Maps conversationally, reducing the need for screen touches while on the move.

Hands-free, multi-step navigation
With Gemini’s support, users can perform complex tasks through simple voice commands, such as finding nearby spots, checking EV charger availability, sharing ETAs, or adding events to their calendar.
Users can even ask multi-layered questions such as, “Find a budget-friendly restaurant with vegan options along my route” or “What’s parking like there?” Drivers can also report incidents hands-free by saying, “I see an accident” or “Watch out for that slowdown.”
Landmark-based directions
Google Maps now offers more natural navigation cues by referencing visible landmarks such as cafes, buildings, or gas stations – for instance, “Turn right after the Thai Siam Restaurant.” The feature combines Street View data with Maps’ massive database of over 250 million locations to make directions more relatable and easier to follow.
Proactive traffic alerts
Even when not actively navigating, users will receive AI-powered alerts about heavy traffic, road closures, or accidents, helping them plan routes ahead of time.
Lens powered by Gemini
Once users reach a destination, Lens with Gemini lets them explore surroundings by simply pointing their camera, identifying restaurants, shops, or landmarks, and even asking contextual questions like, “What is this place and why is it popular?”
Google Maps update availability
- Hands-free navigation: Android & iOS (Gemini-enabled regions); Android Auto support coming soon
- Landmark-based directions: Rolling out in the U.S. on Android and iOS
- Proactive alerts: Rolling out in the U.S. on Android
- Lens with Gemini: Gradual rollout later this month in the U.S. on Android and iOS
