At its annual Worldwide Developers Conference on Monday, Apple unveiled a series of major new features, ranging from an ambitious visual overhaul to advances in its artificial intelligence offering, dubbed Apple Intelligence. The Cupertino-based company also announced that it would open up the underlying technology behind this AI suite to developers.

Here are the main announcements from the event:

New visual identity with "Liquid Glass"

Apple is introducing a new design language called "Liquid Glass," which will bring a translucent and glossy aesthetic to app interfaces, evoking a fluid glass effect. Inspired by the visionOS design on the Vision Pro headset, this style reacts dynamically to lighting, to dark or light mode, and to user movement via real-time rendering.

Buttons, sliders, media controls, tab bars, sidebars, and toolbars will benefit from this redesign. Apple will provide developers with updated APIs so they can adapt their apps ahead of the rollout planned for later this year.

New operating systems and name change

Apple is abandoning its traditional numbering system for iOS updates. The successor to iOS 18 will not be called iOS 19 but iOS 26, in reference to the year following its release, much like the automotive industry's model-year convention.

This new version introduces a major graphical overhaul. The Phone app will now be able to filter calls, answer them for you, or put them on hold. The Messages app gets customizable wallpapers for chats.

Apple is also enhancing the capabilities of Xcode, its development environment, with generative AI. Xcode will integrate tools capable of generating, testing, and correcting code, notably with built-in support for the ChatGPT model.

Apple Intelligence is gaining momentum

Apple's artificial intelligence is gaining new features, such as Live Translation, which allows conversations to be translated in real time in text messages, phone calls, or FaceTime, all locally on the device.

Apple Pay will now be able to track orders, even for purchases made outside its own platform, thanks to the integration of Apple Intelligence.

The Image Playground image generator will also see a significant improvement, with the ability to create visuals using ChatGPT.

In a strategic move, Apple will allow developers to access its foundation model embedded in devices via the new "Foundation Models" framework to create intelligent and privacy-friendly experiences, even offline.
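In practice, Apple pitched this as a few lines of Swift for developers. The sketch below illustrates what a call into the on-device model might look like; the `LanguageModelSession` type and `respond(to:)` method shown here reflect the framework as presented at the keynote, but the final API surface is an assumption pending Apple's released documentation:

```swift
import FoundationModels

// Create a session backed by Apple's on-device foundation model.
// Inference runs locally, so it can work offline and keeps data private.
let session = LanguageModelSession(
    instructions: "You are a concise assistant inside a recipe app."
)

// Ask the model for a response; the call is asynchronous and can throw.
let response = try await session.respond(
    to: "Suggest three vegetarian dinner ideas."
)
print(response.content)
```

Because the model ships with the operating system, developers would not need to bundle weights or call a cloud endpoint for basic text generation.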

Visual Intelligence to enrich the user experience

Apple is introducing a new feature called Visual Intelligence. It will allow users to learn more about the content displayed on an iPhone screen by offering, for example, searches for similar images on Google, Etsy, or other partner apps.

When an event is detected, the system will automatically offer to add it to your calendar. This feature will be accessible via the same shortcut used to take a screenshot.

With this series of announcements, Apple continues its shift toward a unified user experience focused on artificial intelligence, privacy, and a redesigned aesthetic. In his daily note at Jefferies, analyst William Beavington comments: "The main focus of Apple's WWDC25 in terms of technology development has been on liquid glass, with much less time devoted to Apple intelligence this year compared to last year. It will have a translation feature in messages and Facetime, which is not that innovative. Samsung has had this AI feature for two years now."