Opinion

Why Samsung's Galaxy AI Actually Feels Like the Future of Phones

Galaxy AI can do wild things like translate phone calls live, but it's focused on utility first rather than futurism for its own sake.

by Ian Carlos Campbell
A hand holding a Samsung Galaxy S24 Ultra with a bunch of apps on the screen.
Photograph by Raymond Wong

Galaxy AI, if everything Samsung demoed at its Unpacked event works as shown, is our first glimpse of a realistic vision of how AI fits into the phone of the future. Samsung’s Galaxy S24 line doesn’t look all that different from the S23 phones that came before it; in fact, Galaxy AI is really its only key differentiator, save a new color, display, or material. But what’s really surprising about the new AI suite is that rather than gamble on some kind of futuristic all-purpose assistant, Samsung seems to have decided AI should be useful first.

The company has shared its plans to develop an AI model internally that it calls “Gauss,” with specialized versions for handling a diverse set of skills, ranging from summarizing emails to writing code. But what we’ve seen of Galaxy AI, at least for now, seems to be driven by Google. Samsung and Google’s partnership has borne fruit in the past, and it certainly seems like it could again for whatever an “AI phone” is. Samsung’s practical pitch, in contrast even to the exciting one Google made for the Pixel 8 last year, is that the best AI might simply make using a smartphone easier, and it seems like it might just work.

A Better Camera and a Better Phone

The Galaxy S24 and S24+ have the same three cameras but new AI-powered software to make them perform even better.

Photograph by Raymond Wong

Cameras — the sheer volume and quality of them — have been one of the main ways Samsung differentiates itself from its competitors. A focus on providing Galaxy phone owners novel, and at times outlandish, camera capabilities has drawn criticism before, but what’s being offered with the S24 line’s new “ProVisual Engine” seems like a meaningful step up. There are the usual improvements to night photography and a clearer digital zoom, but the editing features are the real stars. Generative Edit on the Galaxy S24s can fill in the background of an image if you resize, crop, or straighten it. Objects and people in photos can be moved entirely, and Galaxy AI will attempt to fill in and adapt the background. Samsung has even come up with a way to convincingly remove reflections and to turn regular videos into slow-mo ones after the fact.

If these skills sound familiar, it’s because they were introduced in one way or another on the Pixel 8 and 8 Pro as part of Google’s Magic Editor feature. The difference is that Samsung’s implementation, particularly the ability to fill in backgrounds after you straighten or crop an image, just seems much more realistic for an average person to use. It’s fascinating that Google decided to let people completely change the background of an image on the fly, but it’s also just not something most people want to do regularly.

One of the gutsier Galaxy AI features launching with the S24 is something called Live Translate. At its tamest, you’ll be able to use an Interpreter app and translate between dozens of languages in a split-screen view, with responses in your chosen language displayed on the bottom half of the screen facing you, and your words in the language of whoever you’re speaking to displayed on the other side. But those live translations can also appear in your default messaging app and, wilder still, in live phone calls. According to Samsung, when you make or receive a call, all you have to do is tap a few buttons and your S24 will automatically translate your conversation. Samsung is so confident in the S24’s Live Translate abilities that you can leave the feature on by default, so all calls in your non-native language will be translated. It seems foolish at best to trust these translations with the important conversations in your life, but it’s also hard not to be impressed by how simply this improves your phone as an actual communication tool.

A Better Computer

Circle to Search changes how you look for things with your phone by making it even more intuitive.

Photograph by Raymond Wong

The tweaks Galaxy AI introduces do more than just make typical smartphone features better; there’s at least one that could shift behavior entirely. Circle to Search, which Google is launching first with the S24 line (and also bringing to the Pixel 8 and 8 Pro), makes any image or piece of text searchable, just by long-pressing your phone’s home button and then circling or underlining whatever you want to learn more about. You can even add text search terms for more specific results. It’s a feature that makes perfect sense for the S Pen on the Galaxy S24 Ultra, but should work just as well with a finger.

Circle to Search builds on concepts like Google Lens, which uses image recognition to identify and search for information; “multisearch,” Google’s technology for combining image and text searches; and even older Google Assistant skills that let you receive information about things on your current phone screen. But it trumps all of those by making a lot more sense. It’s inherently easier to circle something on your screen than it is to type in a long phrase or snap a photo. It’s also inherently insular: Circle to Search keeps you on your phone rather than engaging with the world, but in the context of the wider conveniences Galaxy AI is bringing to your phone, that doesn’t feel bad.

That’s really the strangest part of Samsung and Google’s partnership, Galaxy AI, and these new S24 phones. Google seems to be providing a lot of the technological muscle — at the very least, servers in Google Cloud, the Imagen 2 model for generating things in photos, and various forms of the Gemini AI model — and is even capable of doing a lot of the things Galaxy AI does. But Samsung has just done a much better job connecting the dots between AI ideas in a way that a normal, smartphone-carrying, TikTok-scrolling, online-shopping person can follow. They made an AI phone for normies, and they’re selling it well.

Not a New Kind of Device (Yet)

A constant refrain as more and more pieces of AI hardware have been shown to the public is that most of the functionality of the Humane Ai Pin or the Rabbit R1 could just be a phone app or a new version of Siri. While boring and not much fun, that’s not really an incorrect statement. I do think it’s a potentially misinformed one, though. I really don’t think we know what a fully AI-driven device looks like yet, and it’s not clear you’d want to use whatever that new interface will be, especially if it’s managing all of your apps and services. It could very well be too much at once. Too risky, when you can get so many benefits just by making normal smartphone features work a little better.

That’s what’s refreshing about Samsung’s Galaxy AI. It’s positively restrained in comparison to the other AI devices out there, possibly more restrained than whatever the Pixel 9 will be able to do. That’s unusual for Samsung, but it makes sense for the largest maker of Android phones, and it explains why Google is introducing a feature that seems critical to future Android devices on a far less futuristic-feeling smartphone. It’s just pragmatic.
