Apple has unveiled new versions of the iPhone and other devices. Here are the key takeaways from the announcement, along with an answer to the question "What's going on with Apple Intelligence?"
🔸 The iPhone 16 Pro gets larger screens (6.3 and 6.9 inches), an A18 Pro chip with a 35 TOPS neural processor, and a dedicated touch-sensitive camera button. In the camera app it lets you take photos, switch between modes, and control the zoom; it can also trigger a visual search similar to Yandex's Smart Camera. The iPhone 16 Pro starts at $999 (before taxes).
🔸 On the iPhone 16, the camera unit is now arranged vertically, and the phone gains the Action button and camera button found on the iPhone 16 Pro. The base model uses the less powerful A18 chip. Prices start at $799.
🔸 There are two versions of the AirPods 4: the higher-end model adds active noise cancellation and wireless charging, and both switch from Lightning to USB-C. They cost $129 and $179, respectively.
🔸 The AirPods Pro 2 will gain features that let the earbuds work as a hearing aid; Apple plans to have this certified in more than 100 countries.
🔸 The Apple Watch Series 10 gets a larger screen and a thinner case; prices start at $399.
🔸 Apple Intelligence will be available in October and will gain support for Chinese, French, Japanese, and Spanish next year.
One of the main themes of the presentation was the iPhone's neural network features, made possible by powerful neural processors. Here is why they are needed and how Apple Intelligence will work.
What is a neural processor?
A neural processor, or NPU (neural processing unit), is a part of a chip, or a separate chip, whose architecture is optimized for running machine learning algorithms, including neural networks, very quickly. This makes it possible to run even fairly "heavy" models, such as large language models, directly on the device. As a result, these features don't depend on the quality of your internet connection, and the data being processed is never sent to a server.
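To illustrate, on Apple platforms an app can ask Core ML to keep inference on the Neural Engine rather than calling out to a server. Here is a minimal Swift sketch; the model name "SmallLanguageModel" and the "prompt" input feature are hypothetical placeholders for whatever compiled Core ML model an app actually bundles:

```swift
import CoreML

// A minimal sketch: prefer the CPU and Neural Engine for inference.
// "SmallLanguageModel" is a hypothetical compiled .mlmodelc bundled with the app.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine

do {
    let modelURL = Bundle.main.url(forResource: "SmallLanguageModel",
                                   withExtension: "mlmodelc")!
    let model = try MLModel(contentsOf: modelURL, configuration: config)

    // The prediction runs entirely on-device: no network call, no data sent to a server.
    // The "prompt" feature name depends on the specific model and is assumed here.
    let input = try MLDictionaryFeatureProvider(dictionary: ["prompt": "Hello"])
    let output = try model.prediction(from: input)
    print(output.featureNames)
} catch {
    print("On-device inference failed: \(error)")
}
```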
Does Apple Intelligence run on the neural processor?
Apple has opted for a hybrid approach: some features run directly on the device, while the most computationally demanding ones run in the cloud. The company says it designed its Apple Intelligence servers so that data is deleted immediately after processing and no Apple employee has access to it.
Will Apple Intelligence work only on the iPhone 16?
No. The system will also be available on last year's Pro line (iPhone 15 Pro and Pro Max) and on Macs with M1 chips and newer. Notably, the iPhone 16 will ship without Apple Intelligence support: the first features will arrive in iOS 18.1, which is expected in October. You can try Apple Intelligence now, though, by installing the iOS 18.1 beta on an iPhone 15 Pro or Pro Max and setting the system language to English.
Do other smartphones already have something similar?
Yes. Some flagship Android smartphones, such as the Google Pixel 8 Pro and Pixel 9 and the Samsung Galaxy S24, run the Gemini Nano language model locally. Google Photos also has a feature for removing objects from photos; a counterpart is being tested in iOS 18.1. At the same time, each platform has its own unique features: on the Google Pixel 9 and Galaxy S24, for example, a neural network can realistically draw new objects into a photo, while iOS will soon get ChatGPT integration built directly into the system, so the neural network can be used in any input field.