- I tried out Apple Intelligence on an iPhone 16 Pro Max over the first few days of its rollout.
- The iOS 18.1 update introduced AI tools like writing aids and photo editing on newer iPhones.
- I found the AI tools streamlined tasks but had limitations.
I’ve been testing out Apple Intelligence for 48 hours, and it’s already changing the way I use an iPhone.
For transparency, I own an iPhone 14 Pro Max — an older model that can’t support Apple’s newly rolled-out AI features. However, I’ve been using Apple Intelligence on my partner’s brand new iPhone 16 Pro Max since its launch on Monday, a handset Apple previously said is built for AI.
Once Apple Intelligence was enabled on his phone, the difference in functionality compared to my device was immediately obvious. Setup can take time, though: many apps are compatible with Apple Intelligence, and some require their own adjustments in the Settings app.
Although iOS 18.1 includes only a fraction of the AI capabilities Apple has shown off since the Worldwide Developers Conference in June, the update includes AI-powered tweaks to the apps I use daily.
The “writing tools” feature, the “clean up” photo-editing tool, and email summaries in the Mail app are among the AI capabilities available on updated iPhone 15 Pro models or later.
Now that I’ve had time to try out Apple Intelligence, there are two functions I like most, both of which could make communication much easier for me as someone who works in the media.
Much of my job consists of texting and calling sources as I’m working on stories. The AI features introduced to the Messages and Phone apps have streamlined that part of my news-gathering process.
I can have lengthy text message responses summarized without having to read them right away, and I can easily catch up on ongoing conversations in a group chat.
The tools aren’t perfect, and Apple has said that results from Apple Intelligence can vary. The notification summaries can leave out key details that might make me want to read a message immediately, but the feature is a solid starting point for Apple.
Usually, when I interview sources, I set my phone to speaker mode and hold it up so my laptop can record and transcribe the conversation. With Apple Intelligence, I can record and transcribe calls at the tap of a button.
For those worried about privacy, the feature notifies both parties that the call is being recorded. While there are some inconsistencies between the transcriptions and the audio, that’s standard for most recording tools.
The communications tools might not be as flashy as the upcoming image generator or visual intelligence that Apple said will continue rolling out in December and over the course of next year, but they have made my daily iPhone use a little less clunky.
Some analysts have said the AI rollout could be the start of an upgrade cycle for iPhones, something Apple needs as sales slow and competition rises.
However, with the lowest-priced iPhone that supports Apple Intelligence, the iPhone 15 Pro, starting at $899, I don’t feel the benefits are enough to make my iPhone 14 Pro Max feel obsolete — yet.