A Seamless Whole

I don’t know if you noticed, but the visual design language that Apple introduced with iOS 7 has been around for longer on the iPhone than its Mac OS X-like predecessor. MacOS has twice been redesigned from top to bottom in the post-iOS 7 era, and Apple has introduced a few more operating systems in that timeframe. I doubt there will be another wholesale redo of the iOS look and feel for years to come. But that does not mean there have not been subtle adjustments along the way. I want to focus on a tiny change in how Apple treats the edges of the screen, because I think it is indicative of a slight evolution in how it approaches the interplay of hardware and software on the iPhone.

iOS 7 was introduced in 2013. At that time, the Android market was making the shift from LCD displays to AMOLED. In fact, if you look at 2013 smartphone roundups, you’ll find an almost even split between models with LCDs and those with some flavour of OLED. But those early OLED displays suffered from narrow viewing angles and poor colour accuracy, so every model in Apple’s iPhone and iPad lineup used an LCD instead. Those differences materialized in radically different approaches to visual interface design.

In the preceding years, Google gradually shifted Android toward darker design elements, culminating in the 2011 release of Android 4.0 with its “Holo” theme. While there were still some lighter elements, the vast majority of the interface was rendered in white or electric blue on a black field. Since an OLED display lights each pixel individually, and a black pixel is simply switched off, this created a high-contrast look while preserving battery life. That mattered because Android device makers shipped phones with wildly different hardware specifications, often with power-hungry processors, so Google had to create a design language that would work with configurations it could not have anticipated.

Apple, meanwhile, has always exercised more control over its combined hardware and software package. Because every iPhone had a calibrated LCD with an always-on backlight, Apple’s design language went in the complete opposite direction of Google’s. iOS 7 made liberal use of bright white rectangles and a palette of accent colours. When iOS 7 was released, I wrote that “if Apple used AMOLED displays […] the interface would be much darker”. Indeed, when Apple released the Apple Watch, its first product with an OLED display, it featured a visual interface language that was, and remains, dark.

After Apple began shipping iPhones with OLED displays, it introduced a systemwide dark mode in iOS. Dark mode is not limited to OLED devices; iPhones and iPads with LCDs can toggle it too, but it does not look as good on them because the backlight renders black as a dim grey. You will note that the Apple Watch has never had a light mode and, even today, only a handful of watch faces and app screens allow the edges of the screen to be clearly defined.
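To illustrate why the effect depends on the panel, here is a minimal SwiftUI sketch (the view and its name are mine, purely for illustration): the system background colour resolves to pure black in dark mode, which lets an OLED panel switch those pixels off entirely so the display edge melts into the bezel. An LCD keeps its backlight lit behind the same pixels, which is why the same interface reads as grey-on-grey there.

```swift
import SwiftUI
import UIKit

// A hypothetical full-bleed view. Color(UIColor.systemBackground)
// resolves to pure black (#000000) in dark mode; on an OLED panel
// those pixels are simply off, so the screen edge disappears.
struct EdgelessCard: View {
    var body: some View {
        Text("9:41")
            .font(.largeTitle)
            .foregroundColor(.primary)
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            // Extend the background under the safe areas so it runs
            // right to the physical edge of the display.
            .background(Color(UIColor.systemBackground).edgesIgnoringSafeArea(.all))
    }
}
```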

The Apple Watch also marked a turning point in hardware design. The screen’s curved cover of glass or sapphire blended seamlessly into the metal body. And, because of the true black of an OLED panel, it became difficult to tell where the screen ended and the black bezel began, creating a sort of infinity pool effect. Apple’s design guidelines explicitly told developers to treat hardware and software as a unified whole:

Apple Watch was designed to blur the boundaries between device and software. For example, wearers use Force Touch and the Digital Crown to interact seamlessly with onscreen content. Your app should enhance the wearer’s perception that hardware and software are indistinguishable.

When Ian Parker of the New Yorker interviewed Jony Ive around the time of the introduction of the Apple Watch, Ive expanded upon this philosophy:

The Apple Watch is designed to remain dark until a wearer raises his or her arm. In the prototypes worn around the Cupertino campus at the end of last year, this feature was still glitchy. For Marc Newson, it took three attempts — an escalation of acting styles, from naturalism to melodrama — before his screen came to life. Under normal circumstances, the screen will then show one of nine watch faces, each customizable. One will show the time alongside a brightly lit flower, butterfly, or jellyfish; these will be in motion, against a black background. This imagery had dominated the launch, and Ive now explained his enthusiasm for it. He picked up his iPhone 6 and pressed the home button. “The whole of the display comes on,” he said. “That, to me, feels very, very old.” (The iPhone 6 reached stores two weeks later.) He went on to explain that an Apple Watch uses a new display technology whose blacks are blacker than those in an iPhone’s L.E.D. display. This makes it easier to mask the point where, beneath a glass surface, a display ends and its frame begins. An Apple Watch jellyfish swims in deep space, and becomes, Ive said, as much an attribute of the watch as an image. On a current iPhone screen, a jellyfish would be pinned against dark gray, and framed in black, and, Ive said, have “much less magic.”

Alan Dye later described to me the “pivotal moment” when he and Ive decided “to avoid the edge of the screen as much as possible.” This was part of an overarching ambition to blur boundaries between software and hardware. (It’s no coincidence, Dye noted, that the “rounded squareness” of the watch’s custom typeface mirrors the watch’s body.) The studio stopped short of banishing screen edges altogether, Dye said, “when we discovered we loved looking at photos on the watch, and you can’t not show the edge of a photo.” He laughed. “Don’t get me wrong, we tried! I could list a number of terrible ideas.” They attempted to blur edges, and squeeze images into circles. There was “a lot of vignetting”—the darkening of a photograph’s corners. “In the end, it was maybe putting ourselves first,” he said.

This philosophy has continued through to the present-day WatchOS 7. Consider the app launcher, too: as you drag your finger around, the icon bubbles near the edges of the display shrink in a way that almost makes it seem like you are rolling your finger around a sphere instead of moving dots laterally.
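Apple does not publish the curve it uses, but the effect is easy to approximate. Here is a rough, static SwiftUI sketch of my own; the view name, grid dimensions, and cosine falloff are all assumptions, not Apple’s implementation. Each bubble is scaled down in proportion to its distance from the centre of the screen, shrinking toward zero at the corners.

```swift
import SwiftUI

// A static approximation of the watchOS launcher's fisheye effect:
// bubbles scale down as they near the edges of the screen, which
// reads as rolling the grid over a sphere. The cosine falloff is
// my guess at the curve; the real one is not documented.
struct BubbleGridView: View {
    private let columns = 5
    private let rows = 7
    private let spacing: CGFloat = 48

    var body: some View {
        GeometryReader { geo in
            let center = CGPoint(x: geo.size.width / 2, y: geo.size.height / 2)
            let maxDistance = (center.x * center.x + center.y * center.y).squareRoot()

            ForEach(0..<(rows * columns), id: \.self) { index in
                let col = index % columns
                let row = index / columns
                // Offset alternate rows to get the honeycomb layout.
                let x = CGFloat(col) * spacing + (row.isMultiple(of: 2) ? 0 : spacing / 2)
                let y = CGFloat(row) * spacing
                let dx = x - center.x
                let dy = y - center.y
                let distance = (dx * dx + dy * dy).squareRoot()
                // 0 at the centre, 1 at the farthest corner.
                let t = Double(distance / maxDistance)
                // Cosine falloff: full size in the middle, vanishing
                // at the corners.
                let scale = max(0.0, cos(t * .pi / 2))

                Circle()
                    .fill(Color.blue)
                    .frame(width: 40, height: 40)
                    .scaleEffect(CGFloat(scale))
                    .position(x: x, y: y)
            }
        }
        .background(Color.black)
    }
}
```

A real version would also shift the scaling origin as the finger drags, so the “pole” of the sphere follows your touch; this sketch only captures the edge-shrinking half of the illusion.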

The same design language was present on the iPhone 6, albeit with less curviness. It may not have been equipped with an OLED panel, but Apple kept developing screen coatings that made blacks deeper and increased contrast. More noticeably, the edge of the cover glass curved to blend into the rounded edges of the body, establishing the hardware design language for iPhones up until this year.

The original iPhone had a chrome bezel to hide the unresolved junction between the display glass and the aluminum-and-plastic chassis. Beginning in earnest with the iPhone 4 and 5, the physical seams have slowly disappeared. On my iPhone X, there is only a thin black plastic gasket between the face and the stainless steel body, and it is part of a constant curve. My thumb feels only the slightest groove when moving from glass to metal and back again.

iOS’ design language evolved in step with the hardware. The iPhone 6 was introduced in 2014, a year after iOS 7 debuted, but that language felt much more at home on its bigger display and curved glass. Notably, the iOS 7 design language incorporated edge-to-edge elements throughout, in notifications, table views, and so on, that seemed to bleed into the bodywork, as did the new gesture of swiping from the left edge of the display to go back.

But iOS has slowly pulled back on the design elements that tried to create the illusion of a continuous product with little separation between hardware and software. Several years ago, notifications, widgets, and Siri results were redesigned as glass bubbles rather than full-width blocks. iOS 13 marked the return of the rounded, inset grouped table view. Even iOS 14’s “full width” widgets are narrower than the display: their edges align with the margins of the home screen icon grid, but the change also fits a pattern of increasing space around onscreen elements.

Perhaps that is also related to Apple’s hardware design language. The bezels of iPhones keep shrinking, so there is a smaller hardware border around the display. Software compensates by increasing the space between onscreen elements and the edge of the display; otherwise, those elements would appear too wide and overwhelm the physical size of the product, particularly on the larger Max models. It strikes me that these changes mark an evolution of that thinking: the hardware is, effectively, seamless, but it is not edge-less. From a visual design perspective, at least on the iPhone, Apple seems more comfortable delineating the boundary between software and hardware instead of letting them bleed together to the same extent as before.

Apple often emphasizes that it develops hardware and software in sync, with the goal of building more unified products. I think we see that in these subtler interface details.