I’ve used the Samsung Galaxy S24’s new AI – this is the best new tool
The Samsung Galaxy S24 has just been announced, which means it’s this week’s phone to launch a thousand hot takes. Next week it’ll be another phone, such as the OnePlus 12. The year has started at a brisk pace when it comes to smartphone launches, and the hype for every new device is intense.
But most people don’t buy a new phone every year or actually need all the new things you can find on the latest iPhone or Android device. You might love tech, but could be fed up with hearing about the ways phone companies think their new devices are going to totally change your life.
Samsung is convinced that AI is the next big thing for you and your phone. The Galaxy S24 series has new ‘Galaxy AI’ features built in, so naturally the firm is saying they are game-changing. The company’s President, TM Roh, said the S24 “transforms our connection with the world and ignites the next decade of mobile innovation”.
Well, I’ve had the chance to use the new Galaxy AI features on the Galaxy S24 to find out if that’s really the case. Samsung has loaded the S24, S24 Plus, and S24 Ultra with clever AI tools such as chat assist, which can change the tone of your texts before you send them (and works in any app, so long as you use the Samsung Keyboard), and note assist, which can summarise notes from typed text or from audio, which it can also transcribe.
I even saw a demo of real-time live translation on a call, which translates both sides of the conversation between two of the 13 languages available at launch. It’s impressive, and while it works from the Samsung Phone app, you can call any number, on any phone.
But I think the pick of the new features on the Galaxy S24 is ‘circle to search’, which combines screengrabs, AI, and Google search to change how you can use images to find information. Think of it as a beefed-up Google Lens exclusive to the S24 (for now). It takes away the inherent friction of Googling something you don’t know: hold down the home button to capture whatever is on your phone’s display, then circle the image you want to know more about with your finger (or the S Pen if you’ve got the Ultra).
This works on literally any screen – your photos, Instagram, TikTok, any webpage, anything. It scans what’s on the screen, then uses AI with Google search to identify what it is and list search results. There’s even a delightful haptic rumble under your finger as you circle the image.
This even works in the camera’s viewfinder without taking a photo. I held up the S24 Ultra to my Garmin Forerunner 265 watch, held down the home button, circled the watch, and hey presto, circle to search quickly identified not only that it was a Garmin, but the exact model too.
From there you can tap the search bar to refine the search. In this instance you can type something like ‘reviews’ or ‘where to buy’ and it quickly takes you down the right search path. Obviously Samsung is doing this with a ton of help from Google and its search engine, but that’s why I think it’s so good: it takes search, something we do every day, and evolves it into something more useful than what we had before.
You can quickly search an image of the delicious-looking plate of food you saw on your friend’s Stories, type ‘recipe’, and it spits out what the dish is and what you need to make it. I tried it and it works surprisingly accurately and very fast, though I was connected to 5G in an area Samsung had set up precisely for demos.
Express.co.uk is getting the Galaxy S24 Ultra in for review, so I plan to put circle to search and all of Samsung’s other new AI tools through their paces, to see whether I buy into the AI future the company thinks phones have in store for us, or whether these are ‘nice to have’ tools that will remain niche, used by a select few rather than everyone who buys the phone.