Google Promises "Smarter Photo Albums Without the Work"
The smarter albums feature picks the best photos and videos from your gallery.
Among its many jobs, Google is now also a personal scrapbooker. Gone is the time-suck of scrolling through hundreds of post-vacation photos to pick out the best selfies to share with your friends. Starting today, the new “smarter albums” update in the Google Photos app will sort through the videos and images in your phone’s gallery and create a special album with “your best shots.”
But how does it know what your “best shots” actually are?
The smarter albums feature tracks all the steps of your vacation, documenting which restaurants you went to, what trails you hiked, and what monuments you visited. It adds maps, curates all the images, and then selects the highlights of your trip. You can customize the album by adding comments and location pins. And in a fashion familiar to anybody who uses Google Docs, you can also invite others to collaborate and add their photos to your smart album. Albums that already exist in Google Photos can also be customized with the smarter albums features.
Google writes in a blog posted today, “We’re taking the best of stories and bringing them to albums, so your adventures are easier to browse, edit, collaborate on, and share.”
If this “best memories” idea sounds familiar, it could be because several social media platforms already do something similar. Twitter, Facebook, and soon Instagram skim through content and stories with machine-learning algorithms that choose what they think we will like. Twitter has its timeline algorithm, Facebook has its “top stories” news feed, and Instagram plans to build an algorithm that selects the photos and videos it thinks you will like best and presents them at the top of your feed.
Google Photos, launched back in May 2015, already has some pretty powerful search and image recognition abilities. The app backs up all your photos on all your devices to the cloud and already neatly organizes them by people, places, pets, etc. Tom Keane, a software engineer on Google Photos, explained on Quora how the app’s categorization works:
“The high-level flow is that we send photos to a ‘Visual Recognition System’ that labels the photos with various tags based on the content of the image. The system might respond with tags like: ‘Cat - 90% confidence,’ ‘Couch - 50% confidence,’ ‘Eiffel Tower - 80% confidence,’ etc. These labels are computed (mostly) using neural networks trained using Google Brain.”
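To make the flow Keane describes a little more concrete, here is a minimal sketch of the general idea: a photo goes to a recognition model, which returns tag-and-confidence pairs, and low-confidence guesses are dropped before photos are grouped by tag. Everything below (the `fake_recognizer` stand-in, the file names, the 0.6 threshold) is a hypothetical illustration, not Google's actual code or API.

```python
from collections import defaultdict


def fake_recognizer(photo_path):
    """Stand-in for a neural-network 'Visual Recognition System'.
    Returns a list of (tag, confidence) guesses for the image."""
    canned = {
        "IMG_001.jpg": [("cat", 0.90), ("couch", 0.50)],
        "IMG_002.jpg": [("eiffel tower", 0.80), ("crowd", 0.30)],
    }
    return canned.get(photo_path, [])


def group_by_tag(photo_paths, min_confidence=0.6):
    """Keep only tags above a confidence threshold and index photos by tag."""
    groups = defaultdict(list)
    for path in photo_paths:
        for tag, confidence in fake_recognizer(path):
            if confidence >= min_confidence:
                groups[tag].append(path)
    return dict(groups)


if __name__ == "__main__":
    print(group_by_tag(["IMG_001.jpg", "IMG_002.jpg"]))
    # {'cat': ['IMG_001.jpg'], 'eiffel tower': ['IMG_002.jpg']}
```

The confidence scores are what make the “90% cat, 50% couch” style of output useful: a curation feature can lean on high-confidence labels and ignore the rest.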
There’s no need to go to Google Play or the App Store if you already have the latest version of Google Photos (1.16) on your phone. The smarter albums feature will automatically show up in the app. The update rolled out today on Android and iOS.