
Apple just rebranded AI

Apple wants to build the most personal AI experiences yet

By
Fawzi Ammache
June 10, 2024

Rebranding AI

Apple just rebranded AI.

In classic Apple fashion, they’ve coined a new term for it: “Apple Intelligence”. Throughout the keynote, Apple said “AI” only twice and never said “Gen AI”. Clearly, they’re trying to separate themselves from the competition by using terms like “personal intelligence”, “personal intelligence system”, and “generative models”. It’s exactly how they distanced the Apple Vision Pro from the AR/VR/metaverse branding by coining “spatial computing” instead.

It’s a clever move: the new name sheds the negative connotations of the word “artificial” and distances Apple from the bad press and sentiment that artificial intelligence attracts around ethics and privacy.

Now, any AI that isn’t “Apple Intelligence” is just “artificial”.

This angle fits Apple’s narrative perfectly: they want to create the most private and personal AI experiences you’ll ever use, unlike the generic, underwhelming experiences of other AI tools that know nothing about us and demand heavy prompting and context-setting.

If you missed the announcements, here’s a summary of what Apple Intelligence is all about.

Writing Tools + ChatGPT integration

Apple is adding Writing Tools as an OS-level feature, which means you’ll be able to rewrite, summarize, or compose text in any app, across all your Apple devices.

For example, you can highlight an email you’re writing and rewrite it in a more professional tone.

You can open up your Notes and read a summary…

… and even get a summarized view of your inbox so you can quickly glance and catch up on what’s important.

The biggest update is that Apple will let you integrate ChatGPT into Writing Tools and Siri. You’ll be able to use ChatGPT’s text, image, and document-analysis capabilities across your Apple devices without going to the ChatGPT website or app.
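To make that concrete, here’s a rough Swift sketch of the kind of round trip a ChatGPT-backed rewrite involves, written against OpenAI’s public Chat Completions endpoint. Apple’s actual integration is private and will surely look different; the rewriteProfessionally helper and its prompt are my own illustration.

import Foundation

// Sketch only: Apple's Siri/Writing Tools integration is private.
// This shows the kind of request a ChatGPT-backed rewrite performs,
// using OpenAI's public Chat Completions endpoint.
struct ChatRequest: Encodable {
    let model: String
    let messages: [[String: String]]
}

struct ChatResponse: Decodable {
    struct Choice: Decodable {
        struct Message: Decodable { let content: String }
        let message: Message
    }
    let choices: [Choice]
}

func rewriteProfessionally(_ draft: String, apiKey: String) async throws -> String {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(ChatRequest(
        model: "gpt-4o",
        messages: [
            ["role": "system", "content": "Rewrite the user's text in a professional tone."],
            ["role": "user", "content": draft]
        ]
    ))
    let (data, _) = try await URLSession.shared.data(for: request)
    return try JSONDecoder().decode(ChatResponse.self, from: data).choices[0].message.content
}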

The ChatGPT integration is an interesting move. Apple makes billions of dollars per year from Google paying them to be the default search engine on Apple devices ($20 billion in 2022 alone). I wonder what deal OpenAI and Apple agreed to, and how much Microsoft was involved.

Maybe Apple determined that building and training their own LLM isn’t a worthy investment, given the costs, complexity, and competition in the space. Or maybe they don’t want to risk being sued over scraped training data, or have to sign expensive data-licensing deals of their own.

Image Generation

Apple is also making image generation accessible across its ecosystem of apps like Messages, Notes, and Keynote. There will also be a dedicated Image Playground app where you can generate images.

They’re taking a more prudent approach by only letting people generate images in 3 styles:

  • Sketch
  • Illustration
  • Animation

These lighthearted, fun styles are safer choices to start with: because none of them are photorealistic, they protect Apple from the bad PR of deepfakes generated on its devices.

What I love about the image generation features is the UI:

No fancy prompt engineering needed. Just tap a few buttons, select the style you want, and add a short description if needed. You can even use a photo of the person you’re texting and transform them into an animated character.
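To show just how constrained that interface is, here’s a small Swift sketch that models the announced options as plain data. Apple hasn’t published an Image Playground API, so every name below is hypothetical:

import Foundation

// Hypothetical model of the announced Image Playground options.
// Apple has not published a developer API; this only mirrors the UI.
enum ImageStyle: String, CaseIterable {
    case sketch, illustration, animation  // the only three styles at launch
}

struct ImagePlaygroundRequest {
    var style: ImageStyle
    var description: String  // a short phrase, not a paragraph-long prompt
    var basePhoto: Data?     // e.g. a photo of the person you're texting
}

// The entire "prompt" is a couple of taps plus a short description:
let request = ImagePlaygroundRequest(
    style: .animation,
    description: "as a superhero with a cape",
    basePhoto: nil
)

The point of the sketch: the whole input space fits in a tiny struct, which is exactly what makes the feature approachable and its output predictable.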

Other image-related features they announced:

  • Genmoji: generate new emojis with a simple prompt
  • Photo Editing: easily remove/erase people or items from your camera roll images
  • Image Wand: transform a sketch into an image

Siri gets a glow-up

The next evolution of AI is agents that can understand your intent and take appropriate actions within a given context. This is clearly a huge area of focus for Apple, with Siri at the center of it.

Siri has gotten a lot of bad press over the years, but it might have finally received the rebrand and glow-up we’ve been waiting for.

Not only does Siri look visually different, it can now tap into your personal context to answer your questions. That means it can search across your messages, notes, calendar events, and files to surface relevant information. I’ll be using this to find all those book recommendations that were sent to me but that I never wrote down.

Siri will also gain on-screen awareness, meaning it can answer questions or take actions based on what you’re seeing on your screen in the moment. For example, you could be looking at an event in Safari and tell Siri to add that event into your Calendar.

Apple also announced an expanded App Intents API, allowing third-party app developers to tap into Siri’s new brain and action-taking capabilities. This is an important move that makes the Apple ecosystem even more valuable, since apps can “talk” to each other to complete complex tasks.
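App Intents isn’t brand new; it’s the framework that already powers Shortcuts, so we can sketch what exposing an action to Siri looks like today. The reading-list example below is made up, but AppIntent, @Parameter, and perform() are the real API:

import AppIntents

// Hypothetical in-app store; stands in for your app's real persistence layer.
final class ReadingList {
    static let shared = ReadingList()
    private(set) var titles: [String] = []
    func add(_ title: String) { titles.append(title) }
}

// An action Siri (and Shortcuts) can discover and invoke.
struct AddBookIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Book Recommendation"

    @Parameter(title: "Book Title")
    var bookTitle: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        ReadingList.shared.add(bookTitle)
        return .result(dialog: "Added \(bookTitle) to your reading list.")
    }
}

Siri’s new brain could then chain intents like this one with personal context from Messages or Mail, which is where the “apps talking to each other” story comes from.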

What’s next

Apple Intelligence will only be available on iPhone 15 Pro models and Apple devices with M-series chips. The idea is to run AI on the device itself instead of sending your data to a cloud server for processing, which keeps it more private. However, some complex tasks (and any request sent to ChatGPT) will still be completed on a cloud server, meaning your data will leave your device.
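Apple hasn’t detailed how requests get routed, but the keynote describes a three-tier setup: on-device by default, Apple’s cloud for heavier tasks, and ChatGPT only after you approve it. Here’s a loose Swift sketch of that decision, with every name and condition being my own guess:

// Speculative sketch of the routing described above; Apple's real
// orchestration is private. Names and conditions are illustrative only.
enum ExecutionTarget {
    case onDevice    // default: private, nothing leaves the device
    case appleCloud  // larger model for tasks that don't fit on device
    case chatGPT     // broad world knowledge, only with user consent
}

func route(fitsOnDevice: Bool,
           needsWorldKnowledge: Bool,
           userApprovedChatGPT: Bool) -> ExecutionTarget {
    if needsWorldKnowledge && userApprovedChatGPT { return .chatGPT }
    return fitsOnDevice ? .onDevice : .appleCloud
}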

Fawzi Ammache
Founder, Year 2049
