Hi, friends. It’s Shira. I’m turning over this inbox space to my colleague Geoff, who has THOUGHTS about what’s good and not-so-good about Apple’s new artificial intelligence features.

(Illustration by The Washington Post; iStock)

This week, you can take a first bite of Apple Intelligence, which Apple says is a more useful, more thoughtful, more private, more made-for-iPhone version of artificial intelligence. After testing Apple Intelligence features on my iPhone for months, I’ve found that the AI still doesn’t do much — and sometimes doesn’t act intelligent at all. Using Apple’s AI also appears to drain my phone’s battery faster.

To get Apple Intelligence on your iPhone 16 or 15 Pro, recent Mac or recent iPad, you’ll have to opt in — and possibly join a waiting list for a few hours. First you have to download the iOS 18.1 software update, then toggle a switch under Settings to request Apple Intelligence.

One thing Apple Intelligence does that you can’t get from ChatGPT or Google Gemini is summarize all your iPhone lock-screen notifications. That’s mildly useful except when it goes bananas, which happens at least once or twice per day. One example: Last Thursday, Apple AI summarized a news headline as “Steve Anderson urges Harris to endorse Harris.” (The actual headline was “Fellow General Steve Anderson Tells John Kelly Why He Must Endorse Harris Now.”)

Apple didn’t answer my questions about why the battery on my year-old iPhone 15 Pro lasts only until about 3 p.m. now that I’m using Apple Intelligence. (Many factors can influence battery life, but this seems like more than just a coincidence.)

To get Apple Intelligence on your iPhone, you need to be running iOS 18.1 on an iPhone 15 Pro or 16, and turn it on with a toggle you’ll find by tapping Settings, then Apple Intelligence & Siri. (Washington Post illustration; Geoffrey A. Fowler/The Washington Post via Apple/TWP)

Apple smartly understands it has a monopoly over much of the data and screen real estate on your iPhone and can use AI to summarize, organize and edit it for you. And I like the more cautious approach Apple Intelligence takes with privacy, running functions locally on your device or on a special cloud service so your personal data isn’t accessible to anyone else. (That’s a big improvement over Meta, Microsoft and Google.)

The problem is, Apple’s AI capabilities are behind industry leaders — by more than two years, according to some Apple employees cited by Bloomberg. Apple says there’s more to come from Apple Intelligence, including the ability to generate images and some much-needed upgrades to its Siri assistant.

In recent days, Apple senior executives have suggested in interviews that an underwhelming debut is all part of the plan. They said Apple only wants to release AI products that get it “right” or are “ready.”

But is it actually ready? Apple Intelligence feels annoying and unfinished just often enough that I wouldn’t blame anyone for leaving it switched off for now, or for waiting another year before buying a new iPhone. To help you decide, here are highlights and lowlights of what Apple Intelligence does in its current form.

What the new Apple Intelligence features do today

Summaries of notifications and emails

What’s good: Glance at your lock screen and see short summaries of your messages, news alerts and other notifications. This could help you catch up quickly.
My favorite Apple Intelligence feature: In the Inbox view of your Mail app, the two lines under the sender’s name and subject now include a short AI summary of the whole message. That’s more useful than previewing whatever happened to be in the first two lines of the email.

Apple Intelligence lock-screen notification summaries sometimes get facts like names wrong, as in this example news headline, which conflated Kamala Harris with John Kelly. (Washington Post illustration; Geoffrey A. Fowler/The Washington Post/TWP)

What needs work: The summaries are right most of the time — but just often enough, they’re bonkers. On Saturday, it mis-summarized a chat about costumes as “Pugsley is little fester.” (What?) And a text from a friend became: “Feedback received, well-received.” All AI is bad at humor and social context, but Apple Intelligence tries and often fails to summarize texts where it truly isn’t in on the joke — and takes up space on your lock screen with its drivel. I’ve also seen it get confused about names and convey the exact opposite of a message’s meaning.

Sometimes the AI mistakes border on misinformation. On Tuesday, Apple Intelligence incorrectly summarized a Washington Post news alert to say: “Harris rally features Elon Musk and Jeff Bezos.” (That didn’t happen. And Bezos owns The Post.) Apple didn’t have an immediate comment. You can turn off the summaries in the settings for notifications.

The Mail app also tries to pull out certain messages it thinks are “priority” and put them at the top of your inbox. For me, it gets this wrong frequently — for example, leaving calendar invites up there long after I’ve already added them to my calendar.

‘Clean up’ photos

What’s good: See a stray hand in your photo? In the Photos app, tap edit and then the new eraser icon. You can tap, circle or “brush” what you want to remove and replace it with an AI-generated background that matches the rest of the photo.

An Apple employee shows an example of Apple Intelligence on the iPhone 16 Pro during the Apple event in Cupertino, California, in September. (Junne Alcantara/The Washington Post)

What needs work: It struggles with even mildly complicated edits. When I tried to clean up power lines in a photo of trees at sunrise, it could erase only about half of them. But Google Photos — which first launched its “magic eraser” function back in 2021 — did a better job, especially with the lines that intersected with trees. Meanwhile, Google Photos has added more AI capabilities, including making sure the faces of people in your photos are looking forward and smiling.

Writing help

What’s good: Select some text in a note or email you’re drafting, and Apple Intelligence will offer to proofread it or rewrite it in a tone that is “friendly,” “professional” or “concise.”

An Apple employee shows another example of Apple Intelligence. (Junne Alcantara/The Washington Post)

What needs work: The Apple Intelligence proofreading function incorrectly told me to use the word “skepticically” instead of “skeptically” in a draft of this column. And I don’t know why I’d rely on Apple Intelligence for writing help over the more capable tools built into the Microsoft or Google software most people already use to do their writing. Google’s Gemini, for example, can rewrite paragraphs to follow any style description you enter, including as a poem.
‘Smarter’ Siri

What’s good: Apple’s assistant has a flashy new interface, and you can stumble over your words and Siri might still understand you. It didn’t flinch, offering the San Diego weather when I asked, “What’s, um, the sun like today down over in San Jose — no, I mean San Diego?”

What needs work: Siri still isn’t as smart as any of the leading chatbots, and its speaking voice can’t carry on a conversational chat the way their realistic-sounding human voices can. Siri also still answers most complex questions with a Google search. A promised ability to send complex questions to ChatGPT is coming with the next software update, Apple says.

Most concerning, Siri still doesn’t really know me. I can’t ask “When is my next haircut?” and have it pull the answer from my calendar. This kind of “personal context” is something Apple touted would come with Apple Intelligence. Someday.