
Data Doctors: Why Google Photos search feels broken

Q: Why is Google Photos AI search so bad and can I turn it off?

A: If searching your photos, and especially your videos, feels worse than it used to, you're not imagining it. Many users can no longer find pictures or clips they know are there, even though the same searches worked perfectly fine in the past.

The issue isn't that your content disappeared; it's that Google Photos changed how its search works, and not in a way that always matches how we remember things.

From keywords to confidence scores

Google Photos used to behave like a filing system. If a photo had location data, dates or simple tags, search would show it, even if the match wasn't perfect.

Now, search is driven by artificial intelligence that tries to interpret what's actually in your photos and videos. It looks for faces, objects, scenes, text and activities, then assigns an internal confidence score. Only photos and videos that pass a "confidence threshold" appear in search results.

That鈥檚 the key change, and the main source of frustration for users.

Why simple searches work better than specific ones

When you search for one word, such as "dolphin," Google only has to answer a single question: Is there a dolphin here? That's often easy, so results appear.

But when you search "dolphins in Greece," Google treats that as one combined idea. Now it must be confident about multiple things at once: that there are dolphins, that there may be more than one and that the photo or video was taken in Greece. If any part of that is uncertain (weak GPS data, offshore photos or older images), the result gets filtered out.

Nothing is missing. The AI just isn't confident enough to show it.
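To make that concrete, here's a toy sketch of the filtering idea. This is not Google's actual algorithm; the scores, the threshold value and the `matches` helper are all invented for illustration. The point is only that a combined query must clear the bar on every concept at once:

```python
# Toy illustration of confidence-threshold filtering in a photo search.
# All scores and the threshold are made up for this example.
THRESHOLD = 0.7

# Hypothetical per-concept confidence scores the AI assigned to one clip
clip_scores = {"dolphin": 0.85, "greece": 0.55}

def matches(scores, concepts, threshold=THRESHOLD):
    """A result appears only if EVERY searched concept clears the threshold."""
    return all(scores.get(c, 0.0) >= threshold for c in concepts)

print(matches(clip_scores, ["dolphin"]))            # True  -> "dolphin" finds it
print(matches(clip_scores, ["dolphin", "greece"]))  # False -> combined search hides it
```

The same clip passes a one-word search and fails the combined one, even though nothing about the clip changed.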

Why videos are even worse

Videos suffer the most under this system. A photo is judged from one moment. A video contains hundreds or thousands of frames, and Google has to decide what the video is about.

If a dolphin appears briefly in a long ocean clip, the AI may decide it's an "ocean video," not a "dolphin video." If a sunset only dominates part of the clip, it may not count as a sunset at all. If the main subject isn't obvious across most frames, the video often won't show up in a search.
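A toy sketch of why brief appearances get lost: if per-frame scores are aggregated across the whole clip (averaging is one plausible scheme, not necessarily what Google does; every number here is invented), a dolphin visible in only a few frames gets diluted below the threshold:

```python
# Toy illustration: a clip's concept score aggregated across its frames.
# A dolphin visible in only 2 of 10 frames gets diluted, so the whole
# clip may not clear the threshold even though the dolphin is there.
THRESHOLD = 0.7

# Hypothetical per-frame "dolphin" confidences for a 10-frame ocean clip
frames = [0.9, 0.9, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

clip_score = sum(frames) / len(frames)   # 0.26 -> "ocean video," not "dolphin video"
print(clip_score >= THRESHOLD)           # False -> hidden from a "dolphin" search
```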

That's why scrolling by date and scrubbing through videos still works: search hides uncertainty, scrolling shows everything.

There's no way to go back, but you can make searching better

Many people look for a setting to turn the AI off. In most versions of Google Photos, there isn't one anymore. AI-based search is now baked into the core of the app. The old keyword-driven system is gone.

So instead of changing the app, you have to change how you search in it.

Start broad, then narrow:

  • Search for the strongest visual concept ("dolphin," "sunset," "receipt")
  • Then narrow by date or suggested location
  • Avoid stacking multiple ideas into one search

For anything that really matters:

  • Add a short caption (for example "dolphins in Greece")
  • Put photos and videos into albums with descriptions
  • Mark key items as favorites

Captions and albums don't rely on AI confidence: they give the system anchors it won't second-guess.


© 2026 草莓传媒. All Rights Reserved.
