
Why Apple’s processors need the Neural Engine. The third power of iPhone magic

Starting with the iPhone X, Apple has appended the word Bionic to the names of all of its A-series chips. The biological ring of the word is backed up by a dedicated module inside the processor called the Neural Engine.

It consists of computing units that help the iPhone solve context-related problems.

The Neural Engine is not tasked with solving an equation or displaying an image. It is a much more subtle technology without which our smartphones would remain dull and unable to recognize us by sight.

Below I’ll explain why Apple boasts about developing this element of its chips as often as it does about the CPU and GPU, what exactly the Neural Engine is needed for, including in the MacBook, and how the neural cores inside the iPhone became the third essential force after computing and graphics.

How the chips are set up. CPUs and GPUs are separate on PCs and together on mobile devices.

Most PCs, especially home PCs, have two main chips: the CPU and the GPU. They sit in different parts of the system unit and, in most cases, are made by different manufacturers.

CPU stands for Central Processing Unit.

It is responsible for the complex, general-purpose processing of instructions. Without it, the computer would not work.

GPU stands for Graphics Processing Unit.

It forms and displays visual information on the monitor, turning the processor’s calculations and the operating system’s (OS) output into the picture you see.

Sometimes the GPU is built directly into the CPU. Such an element is weaker and often serves as a temporary substitute, letting you see what is happening inside the computer while you look for a replacement discrete graphics card. Those come as separate modules.

In compact devices, the hardware must not only deliver powerful results but also conserve power. The device must not overheat from the heat generated by a heavy computing load.

Here, the graphics integrated into the CPU become the main ones. Discrete processors in laptops are usually either cut down in power or run at full strength only when plugged into the mains. In smartphones and tablets, there is no discrete GPU module at all.

The single chip in control becomes something more significant than a regular CPU. It becomes a complex system on a chip.

And Apple had already achieved a leading position here, so it began integrating a new type of computing block into its processors.

There’s the CPU, there’s the GPU, and there’s the Neural Engine.

The story started with the iPhone processors, which gradually gained new elements, the Neural Engine among them. It speeds up the camera’s intelligent features, helps Siri analyze your voice better, and recognizes people in your photos faster.

A little later, Apple took the same approach to build the comprehensive M1 chip for its Macs. The same kind of neural cores sit inside it, and they help with the same everyday tasks as in the iPhone.

A separate chip was never installed in consumer devices for these purposes, but the block gradually received a name of its own.

NPU stands for Neural Processing Unit, a term that has not yet settled into common usage.

The term is gradually coming into use, but each manufacturer calls it in its own way. We will rely on Apple’s formulation, which is Neural Engine, or “neural processor.”

It speeds up speech recognition, the identification of human figures and specific people, the precise determination of cat breeds or flower species, and real-time object tracking, quickly solving the problem of intelligently recognizing one among many.

One condition must be met: the system running on this type of processor must first be trained on other examples.
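
To make this concrete, here is a minimal Swift sketch of how an app could run such a pre-trained recognizer on a photo using Apple’s Core ML and Vision frameworks. FlowerClassifier is a hypothetical model name; the system itself decides whether the Neural Engine executes it.

```swift
import CoreML
import Vision

// A minimal sketch of on-device recognition with a pre-trained model.
// "FlowerClassifier" is a hypothetical Core ML model bundled with the app;
// as noted above, it must already have been trained on example images.
func classify(_ image: CGImage) throws {
    let coreMLModel = try FlowerClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        print("Saw \(top.identifier) with confidence \(top.confidence)")
    }

    // Vision hands the heavy inference to the Neural Engine when one is available.
    try VNImageRequestHandler(cgImage: image).perform([request])
}
```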

The first Neural Engine was used exclusively for Face ID.

Apple has had complete control over the design of its processors since the iPhone 4 and the A4 chip inside it, which came out in 2010.

The company was among the first to bring new technologies to mobile chips: for example, the transition to a 64-bit architecture, the integration of motion and photo coprocessors, and the use of the smallest process nodes (7 nm, 5 nm).

A critical year was 2017, when the iPhone X was released. At its presentation, for the first time, there was talk of a small neural network module inside the A11, the first chip that Apple called, and still calls, Bionic.

Back then, this little block inside the chip was given only a drop of attention: Apple told us how it distinguished a natural face from an artificial copy in the form of a theatrical mask. All we learned about the Neural Engine was that it can learn a user’s appearance, and that was it.

With each new generation, the NPU has evolved considerably and been trained with new capabilities.

In 2021, it helps find specific people and build Memories in the Photos app, analyze 40 facial expressions in real time, and create Cinematic mode videos on the iPhone 13.

How the Neural Engine evolved in the iPhone
No Apple presentation goes by without a mention of the coprocessor and how much its performance has grown.

Below is a complete list of iPhones since 2017, describing the features and improvements of the Neural Engine and its cores.

iPhone X
The technology first appeared in the A11 chip inside the iPhone X.

Two cores that served solely to recognize faces correctly.

Performance: 600 billion operations per second.

iPhone XS and iPhone XR
The A12 inside the iPhone XS and iPhone XR packed eight neural cores, extended machine learning to applications beyond Face ID, and added a “smart compute system.”

This system recognizes the type of task arriving at the processor and decides which block should handle it: the Neural Engine, the CPU, or the GPU.
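
Core ML, Apple’s machine learning framework, exposes a slice of this routing to developers: an app can restrict which blocks are allowed to run a model. A short sketch, where SomeModel stands in for any compiled Core ML model class:

```swift
import CoreML

// A sketch of steering Core ML's compute routing; "SomeModel" is a
// placeholder for any compiled Core ML model class in the app bundle.
func loadModel() throws -> SomeModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all          // default: the system picks the Neural Engine, GPU, or CPU
    // config.computeUnits = .cpuOnly   // force the CPU, e.g. for debugging
    // config.computeUnits = .cpuAndGPU // allow everything except the Neural Engine
    return try SomeModel(configuration: config)
}
```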

Machine learning has improved:
– Keyboard word suggestions
– Photo selection in Memories
– Display of notable places in Maps
– True Tone screen adaptation
– Photo search in Photos

And the second-generation Neural Engine also enabled real-time machine learning.

It was needed for single-camera portrait mode on the iPhone XR and AR camera effects. For example, to superimpose stage lights or track the movement of 50 facial muscles during a FaceTime call.
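
Developers reach this face tracking through ARKit, which reports the detected muscle movements as named “blend shape” coefficients between 0 and 1. A minimal sketch, assuming a device with the TrueDepth camera:

```swift
import ARKit

// A minimal sketch of real-time face tracking with ARKit. The framework
// exposes facial muscle movements as "blend shape" coefficients (0...1).
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        // For example, how far the jaw is open and how raised the inner brows are.
        let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
        let browUp = face.blendShapes[.browInnerUp]?.floatValue ?? 0
        print("jawOpen: \(jawOpen), browInnerUp: \(browUp)")
    }
}
```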

iPhone 11 and iPhone 11 Pro
The A13 inside the iPhone 11 again features an 8-core Neural Engine.

This time, the unit gained 20% more power and cut consumption by 15% by selectively feeding power to different areas of the A13.

Along with machine learning, the NPU improved speech recognition and became faster at tracking facial expressions in real time.

Machine learning acceleration blocks were added to the CPU, making it eight times faster at matrix calculations. These were probably what gave the neural cores their speed boost.

Performance: 6 trillion operations per second.
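
To make “matrix calculations” concrete, here is a tiny worked example using Apple’s Accelerate framework, which routes this kind of math to the fastest facilities the chip provides; the 2×2 values are arbitrary:

```swift
import Accelerate

// The kind of matrix math that the acceleration blocks speed up:
// multiply two 2x2 matrices, C = A x B, stored in row-major order.
let a: [Float] = [1, 2,
                  3, 4]
let b: [Float] = [5, 6,
                  7, 8]
var c = [Float](repeating: 0, count: 4)

// Arguments: A, strideA, B, strideB, C, strideC, then M, N, P,
// where A is M x P, B is P x N, and C is M x N.
vDSP_mmul(a, 1, b, 1, &c, 1, 2, 2, 2)
print(c) // [19.0, 22.0, 43.0, 50.0]
```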

iPhone 12 and iPhone 12 Pro
In the A14 of the iPhone 12, the Neural Engine became twice as big and consisted of 16 cores.

The NPU was 80% faster than the one in the A13.

These improvements helped bring Deep Fusion to all iPhone cameras, including the front and ultra-wide-angle ones. The latter, despite being physically identical to the previous generation, began capturing sharper photos.

The same NPU is found in all of Apple’s M1 series Mac chips.

Performance: 11 trillion operations per second.

iPhone 13 and iPhone 13 Pro
In the A15 of the iPhone 13, the Neural Engine kept the same 16 cores, but they have become noticeably more powerful. Even the new M1 Pro and M1 Max don’t have that.

At this presentation, Apple reminded us that the machine learning elements in the Neural Engine work efficiently thanks to the acceleration blocks in the CPU section.

It is currently the most powerful NPU among Apple products. Its headline feature is recording video with deep background blur; the mode in the Camera app on any iPhone 13 is called Cinematic mode.

It also helps Siri recognize dictation, refines navigation inside Apple Maps, tracks objects in real time, identifies plant species, and speeds up the translation of text in photos into written text.
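
The photo-to-text step is available to any app through the Vision framework. A minimal sketch of such a request, which the system can accelerate on the Neural Engine:

```swift
import Vision

// A minimal sketch of extracting written text from a photo with Vision.
func recognizeText(in image: CGImage) throws {
    let request = VNRecognizeTextRequest { request, _ in
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate // slower but better suited to photos

    try VNImageRequestHandler(cgImage: image).perform([request])
}
```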
