A new generative AI feature, “Help me edit,” is now rolling out to a wider range of Android devices, ending its exclusivity to Google’s Pixel phones. The feature, part of Google Photos, uses natural language processing to let users perform complex photo edits with simple, conversational commands. The move democratizes a powerful AI tool that was previously a key selling point for Google’s own hardware, making it available to a broader segment of the Android ecosystem.
The feature lets users type or speak prompts that direct the AI to make specific changes to their photos. For example, a user can say, “Give the sky a sunset effect” or “Remove the person in the background,” and the AI will perform the edit automatically. The functionality is integrated into the existing Magic Editor and uses both on-device and cloud-based AI to carry out the edits.

The rollout targets all Android phones running Android 16, the latest version of the OS, and is expected to begin in late September and continue into October 2025. It’s a strategic decision that makes a flagship-level AI feature more accessible across the platform.
The wider availability does come with some distinctions. Reports note that non-Pixel users without a Google One subscription will be limited to a set number of free edits per month, while Pixel owners and Google One subscribers will have unlimited access. This approach preserves a value proposition for Google’s subscription service while still giving all Android users a taste of the new generative AI capabilities.
