Apple Intelligence Will Bring Personal AI To Mac, iPhone, and iPad
Apple has included its Neural Engine, a neural processing unit (NPU), in all Apple processors for several generations, using it for on-device machine learning such as intelligent photo processing. Now, in its recent WWDC keynote, Apple announced that it will use these neural engines to run generative AI and large language models (LLMs) on device across its core products. This is the classic Apple approach: laying the groundwork well in advance, then releasing a new feature only when the company feels it can deliver the right "Apple" level of user experience and solve customer pain points.
After the first hour of the keynote covering updates to the various Apple operating systems (hey, the iPad got an amazing calculator), Apple CEO Tim Cook introduced the company's answer to generative AI: Apple Intelligence. Apple Intelligence will run on local devices, but there will also be an Apple cloud element for more sophisticated queries, plus the ability to use OpenAI's ChatGPT and other third-party generative AI (GenAI) products. This lets Apple users choose privacy or gain additional functionality from third-party solutions, keeping Apple competitive with Microsoft and Google. Cook said the company wants to create AI that is powerful enough, efficient enough, intuitive and easy enough to use, and personal enough to be a real assistant, deeply integrated into the Apple experience with privacy built in from the ground up. Cook calls Apple Intelligence "personal intelligence." Apple Intelligence will be released with iOS 18, iPadOS 18, and macOS "Sequoia" this fall.
Not all Apple products support Apple Intelligence: only Mac and iPad products with M-series processors (all the way back to the M1) and last year's iPhone 15 Pro with the A17 Pro processor. In addition to the Neural Engine in Apple's chips needing a certain level of TOPS (tera operations per second) performance, Apple also appears to require at least 8GB of DRAM to hold the GenAI model (the iPhone 14 Pro's A16 Bionic processor has only 6GB of DRAM, even though its 17 TOPS Neural Engine outperforms those of the M1 and M2 processors).
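A back-of-envelope calculation (using assumed figures, not Apple's published sizing) illustrates why a quantized multi-billion-parameter model can fit in a device with 8GB of DRAM alongside the operating system, while an unquantized one cannot:

```python
# Back-of-envelope memory estimate for an on-device LLM.
# Assumptions (illustrative, not Apple's published figures): a ~3B-parameter
# model quantized to an average of ~3.5 bits per weight (a mix of 2- and 4-bit).
params = 3_000_000_000
avg_bits_per_weight = 3.5

quantized_gib = params * avg_bits_per_weight / 8 / 1024**3
fp16_gib = params * 16 / 8 / 1024**3  # same model at unquantized 16-bit

print(f"quantized: {quantized_gib:.2f} GiB")  # ~1.2 GiB leaves room for the OS and apps
print(f"fp16:      {fp16_gib:.2f} GiB")       # ~5.6 GiB would crowd out nearly everything else
```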
Because Apple Intelligence will be deeply integrated into the operating system and able to take advantage of your personal context, Apple hopes it will become a tightly integrated part of the Apple product interface. The company will also use Apple Intelligence to improve its Siri personal assistant, providing better natural language processing and on-screen context understanding. Apple Intelligence also spans several modalities, including voice, written language, and images.
Apple will use AI to give Siri richer language processing and a better understanding of the relevant device context, and will add typing as an input to Siri. The improved Siri can take natural-language feature descriptions to control functions or find information, even across apps, and it will be aware of the information on the display screen while in use. Notably, while Apple didn't focus on GenAI doing creative writing for you, the company talked about how Apple Intelligence will be able to rewrite, proofread, and summarize existing work. This may be Apple's way of avoiding the issue of GenAI replacing creative professionals.
Another example is Apple Mail, where the GenAI feature can summarize an email rather than just showing the first few lines. It can also prioritize important messages and notifications, and reduce notifications for less important messages.
One of the more interesting uses of the new Apple Intelligence is the creation of custom AI-generated emoji characters, which Apple calls "Genmoji." Image Playground will generate these custom emoji from a written or spoken description of the image and style, and it can draw on your own image library for image generation. Apple Intelligence will also enable natural language image searches across photos and videos and let you create your own memory movies based on your inputs. While Apple Intelligence has clear boundaries around functionality and privacy, Apple provides support for third-party GenAI solutions like ChatGPT in Siri. However, users must explicitly grant permission before ChatGPT is used. Apple's writing and imaging tools, such as Compose, support ChatGPT for free, and premium ChatGPT subscriptions can be accessed directly through Apple's interfaces.
Through a new collection of APIs and the App Intents framework, third-party applications will also be able to use Apple Intelligence's on-device GenAI services.
The goal of Apple Intelligence is to ground your interactions with the Mac, iPhone, or iPad in personal information and real-time on-screen context. This gives the local AI context from real-time, personalized information, and privacy can be maintained by keeping that data on device.
Apple Intelligence is based on compact large language and diffusion foundation models that can access an on-device semantic index of personal data; more details can be found in Apple's blog post on its foundation models. The base on-device Apple Intelligence model has about 3 billion parameters. Apple uses low-rank adaptation (LoRA) adapters to specialize the base model for individual tasks, along with a mix of 2-bit and 4-bit quantization to reduce model size and memory requirements without losing accuracy.
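The LoRA technique can be sketched generically (this is an illustration of the idea, not Apple's implementation): rather than fine-tuning a full weight matrix for each task, a small low-rank update is trained and added to the frozen base weights, so each task adapter is tiny compared with the model itself.

```python
# Generic LoRA sketch (illustrative; not Apple's code).
# Instead of fine-tuning a full weight matrix W (d_out x d_in), LoRA trains two
# small matrices B (d_out x r) and A (r x d_in), with rank r << d, applied as:
#   W_eff = W + (alpha / r) * (B @ A)
# so each task adapter stores only r * (d_out + d_in) parameters.
d_out, d_in, r = 1024, 1024, 8

full_params = d_out * d_in           # updating W directly: 1,048,576 params
adapter_params = r * (d_out + d_in)  # LoRA adapter: 16,384 params

# Tiny numeric example: 2x2 base weight, rank-1 adapter, scale alpha/r = 2.0
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[0.5], [0.0]]   # d_out x r
A = [[0.0, 1.0]]     # r x d_in
scale = 2.0
W_eff = [[W[i][j] + scale * sum(B[i][k] * A[k][j] for k in range(1))
          for j in range(2)] for i in range(2)]
print(full_params // adapter_params)  # 64x fewer parameters per task
print(W_eff)                          # [[1.0, 1.0], [0.0, 1.0]]
```

Because the base weights stay frozen, many such adapters can be stored and swapped per task (summarization, proofreading, and so on) without duplicating the full model in memory.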
If the local model cannot handle the complexity of a request, Apple has special servers, built on Apple silicon and running a larger model, in a system called Private Cloud Compute (PCC); a detailed Apple blog post provides more information. The process is that if Apple's on-device evaluation determines the local AI is not sufficient, the relevant data is sent to PCC for processing.

Apple has taken a fairly conservative approach to artificial intelligence, adding easily integrated enhancements to existing features in its Mac, iPad, and iPhone products. Supporting older products also builds a larger installed base, which will appeal to developers. Missing from the supported products was the Apple Vision Pro XR headset. That is a place where AI support for hands-free task direction would be greatly appreciated, and generative imagery could be even more compelling in mixed reality. We'll have to wait for the fall release of Apple Intelligence features to see the impact on battery life and how it compares with Microsoft's Copilot+ PCs.