About Products, ALL

What’s so good about the iPhone screen? They’re both OLED, but Apple did it right…

Preface

I would like to share some of the display features that make Apple products look good: for example, color management, which renders content in the correct color space according to the image's or video's color profile; and local HDR, which exploits the high peak brightness of OLED or mini-LED so that images and videos can reproduce the brightness of the original scene as faithfully as possible.

But I wonder if you have noticed something: these two features are essentially doing the same thing, which is building a container with enough brightness and color headroom, moving all the content into it, and then converting it into the data format the screen accepts. The screen's pixels accept only one format: an RGB signal plus a drive brightness. All of this happens in software, without dedicated hardware. I like to call it the art of manipulating pixels.

But today's topic sits one level below that. As we all know, each pixel of the content is actually displayed by mixing several RGB emitters on the screen. These are the screen's subpixels.

How are subpixels driven for display? How do OLED and LCD differ? And what has Apple built up in this area? Let's go through it in detail.

Subpixel Rendering

First, some quick background. OLED subpixels are not laid out RGB 1:1:1. In the mainstream diamond arrangement, there are only half as many red and blue subpixels as green ones. So to display a white dot, lighting just one R, one G, and one B may not be enough; you have to borrow subpixels from neighboring positions to reproduce the color and shape as faithfully as possible. This is the subpixel rendering algorithm.
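To make the borrowing idea concrete, here is a deliberately toy sketch. The real algorithms live inside the DDIC and are proprietary; the checkerboard layout and the quarter-weight split below are assumptions chosen purely for illustration.

```python
# Toy model of diamond-arrangement subpixel rendering (illustrative only).
# Green subpixels exist at every logical pixel; red and blue exist only at
# alternating sites, so a white dot at a site that lacks a local red (or
# blue) must "borrow" that channel from neighboring sites.

R_SITE = lambda x, y: (x + y) % 2 == 0   # toy layout: red on even diagonals
B_SITE = lambda x, y: (x + y) % 2 == 1   # blue on odd diagonals

def render_white_dot(x, y):
    """Return {(x, y, channel): intensity} needed to show one white dot."""
    lit = {(x, y, "G"): 1.0}             # green is always available locally
    for ch, here in (("R", R_SITE), ("B", B_SITE)):
        if here(x, y):
            lit[(x, y, ch)] = 1.0        # channel exists at this site
        else:
            # Borrow: split the energy across the 4 nearest sites that do
            # carry this channel, at quarter intensity each.
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                lit[(x + dx, y + dy, ch)] = 0.25
    return lit

dot = render_white_dot(1, 0)  # this site has blue locally but borrows red
```

The half-lit neighbors produced by the borrowing step are exactly what shows up later under the microscope.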

Done well, you will never notice OLED's inherent shortage of red and blue subpixels. Done poorly, edges look blurry or jagged and text never seems quite sharp. It is one of those features where the better it works, the less you notice it.

What you may not know is that although LCD uses a standard RGB stripe arrangement, it also needs similar tricks to smooth curves and rounded corners: ClearType and integer scaling on Windows, and HiDPI rendering on the Mac, are all solving similar problems.
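The LCD stripe variant can be sketched the same way. This is a toy in the spirit of ClearType, not Microsoft's actual algorithm (real implementations also filter the result to limit color fringing): render edge coverage at three times the horizontal resolution, then map each run of three samples onto the R, G, B stripes of one LCD pixel.

```python
# Toy RGB-stripe subpixel antialiasing: 3x horizontal coverage samples are
# folded into the three color stripes of each output pixel, giving edge
# positioning at one-third-pixel precision.

def subpixel_aa(coverage_3x):
    """coverage_3x: list of 0..1 samples, length = 3 * output pixels."""
    assert len(coverage_3x) % 3 == 0
    return [tuple(coverage_3x[i:i + 3])      # (R, G, B) per output pixel
            for i in range(0, len(coverage_3x), 3)]

# An edge extending one third into the second pixel: with whole-pixel
# sampling it would be all-or-nothing, but here only the R stripe of
# pixel 1 carries the fractional coverage.
row = subpixel_aa([1, 1, 1,  1, 0, 0])
```

Lighting a single stripe is also why naive subpixel antialiasing produces the color fringes mentioned later in the LCD comparison.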

On desktops these functions live in software and can adapt to different displays, but under the much stricter power budget of mobile devices, similar algorithms have to be baked into hardware. That hardware is the display driver IC, or DDIC. It is not very eye-catching and hides behind the panel, but you can usually spot it in Lou Bin's teardown videos.

This is also where software control ends. The operating system cannot light up an individual subpixel by command; only the DDIC can. So whatever content or algorithm you discuss, in the end you cannot avoid this chip. I will only introduce it briefly today; we may return to it in future videos. With the background out of the way, let's get to the point.

To demonstrate the effect of subpixel rendering, I dusted off the HTML skills I abandoned years ago and wrote a webpage to test it. My thanks to Mr. Navis for the idea. The reason I used a webpage rather than an image is that every device has a different resolution: some are 1080P, some 2K, and some, like Apple's, non-standard. To make sure the test pattern appears at a similar size on every device, uses the same font, and is untouched by image compression, drawing it with code is the most reliable approach. The page looks simple, but it does the job.
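The author's page itself is not reproduced here, but the idea of drawing the pattern with code can be sketched in another form. Assume for illustration that we emit an SVG string, so line widths and spacing stay exact in device-independent units and image compression never touches them:

```python
# Minimal sketch of a code-drawn test pattern: vertical lines of exact
# unit width and spacing, emitted as an SVG string rather than a bitmap,
# so no rescaling or compression can blur the geometry.

def line_pattern(n, gap, unit=1, height=40):
    """n vertical lines, each `unit` wide, separated by `gap` units."""
    lines = [
        f'<rect x="{i * (unit + gap)}" y="0" width="{unit}" '
        f'height="{height}" fill="black"/>'
        for i in range(n)
    ]
    width = n * (unit + gap) - gap
    return (f'<svg xmlns="http://www.w3.org/2000/svg" '
            f'width="{width}" height="{height}">' + "".join(lines) + "</svg>")

svg = line_pattern(n=10, gap=2)   # 1-unit lines with 2-unit gaps
```

The same function with `gap=1` produces the tighter 1-unit-interval pattern used in the comparisons below.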

The implementation is very rough, so please go easy on me, fellow developers. You can try it on your own phone. And let me remind you again: the subpixel rendering algorithm is actually implemented by the DDIC vendor and the panel maker, so the differences here are essentially between Samsung panels and domestic panels, between LCD and OLED, and between resolutions.

So please don't fixate on which phone model I am using; it is irrelevant to this test. Let's start with a 1080P LCD.

Look at the upper part of the test pattern first. At first glance the result is very good: the text is sharp, not blurry, and the straight and diagonal lines spaced 2 units apart are clearly distinguishable.

It is an LCD, after all, so the basics are no problem. But look down at the icons: the borders of the sticky-note icon are unequal in width left to right and top to bottom, and the headband of the headphone icon shows obvious jagged edges.

Under a microscope we can see the actual subpixel rendering: the half-lit pixels are the ones borrowed from the side for smoothing. But an LCD can only borrow whole RGB pixels, never a single subpixel. The sticky-note icon could in fact be drawn exactly without borrowing; once pixels are borrowed, its four borders end up unequal in width. On the headphone icon the processing is not aggressive enough: the upper edge of the headband borrows nothing, so the curve never looks smooth.

Now let's compare 1080P OLED on these two icons, putting a domestic panel and a Samsung panel side by side. The sticky-note icon shows the same unequal-border problem, and here the domestic OLED seems to do better; but the curve of the headphone icon looks smoother on the Samsung OLED, better even than on the LCD.

Under the microscope, when the domestic panel borrows subpixels, the upper edge of the curve borrows several extra groups of pixels downward, and its red subpixels are so bright and conspicuous, encroaching too far into the black area, that they break the continuity of the curve.

Going back to the test page and the large icons in the lower row, the Samsung panel still has the edge, but the domestic panel claws one back on text: it looks sharper, with less of the edge color fringing that borrowing causes. So each has its strengths.

This also shows that as long as the subpixel rendering algorithm is done well, the gap between OLED and LCD is not obvious. In some scenes OLED even beats LCD in smoothness, because its subpixels can be controlled individually to fill in edges. Today's OLED is no longer the OLED of the past, just as CMOS sensors once trailed CCD in image quality yet eventually displaced it. Times really have changed.

Now let's compare the 1.5K and 2K class screens; following Redmi's terminology, we will count the iPhone as 1.5K. All three are much sharper than the 1080P screens, especially on line resolution. With the higher pixel density, the subpixel rendering algorithm has more room to work, and even a single-unit dot renders more clearly than at 1080P.

Looking at these panels under the microscope, 1.5K really is a watershed: displayed dots are much closer to square. And in the green lines spaced 1 unit apart, the gaps between lines are also clearer than on the 1080P screens.

If the three screens are hard to tell apart there, the icons below give the iPhone a huge lead: the sticky-note icon's borders are equal on all four sides, and the headphone icon is extremely smooth, beating even the higher-resolution 2K screen.

Zooming into the details, the iPhone uses more subpixels to fill the black area, leaving the smallest non-emitting region. But unlike the domestic 1080P panels, the filled subpixels are driven at very low, well-balanced brightness, so to the naked eye the curve appears smoother.

We can draw an interim conclusion here. At 1080P, LCD and Samsung's OLED each have strengths and both edge out domestic OLED: LCD renders text more clearly, while Samsung's OLED handles curves more smoothly. At the 1.5K class, the domestic 1.5K panel, the iPhone, and Samsung's 2K panel perform roughly the same, but in the smoothness of icon rendering the iPhone shows a clear advantage.

Of course, these test charts are extreme cases, built specifically to expose problems. In everyday use, differences in the sharpness of the content itself may far outweigh the strengths and weaknesses of the algorithm.

Still, it must be said that the rapid progress of domestic panels over the past two years has made their output very impressive, gradually erasing the clarity advantage Samsung's diamond-arrangement panels enjoyed in the early years. In particular, I argued in the K50U video that 1.5K panels must replace 1080P in mid-range products, and today's results bear that out. Moreover, high resolution does not automatically mean clarity: a good subpixel rendering algorithm can erase a resolution advantage or even overturn it.

A camera analogy fits here: a good sensor only sets the floor, while good algorithms and tuning raise the ceiling. I suspect some of you will say this test is too extreme, that 1.5K is plenty and there is no difference in actual use. I believe most people think so.

Indeed, if this first episode on Apple's moat dug only this deep, Apple's advantage would not look convincing enough. But think about it: Apple spends so much money every year commissioning exclusive custom panels. Where does that strength actually show? Is there some detail we haven't uncovered?

Demonstrating Capability

Now let's look closely at Apple's screen. Compared with either a domestic 1.5K panel or a Samsung 2K panel, have you noticed that Apple's subpixels seem larger, with smaller gaps between them? In fact, whichever OLED you compare it against, the iPhone screen has the largest subpixels. The technical term for this is the pixel's "aperture ratio."

A higher aperture ratio brings many benefits. At the same drive voltage, large-aperture pixels are brighter; conversely, reaching the same brightness costs less power and extends the panel's lifetime to some degree.
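A rough illustration of the lifetime claim, using the commonly cited OLED degradation relation that lifetime falls off as a power of current density. The acceleration exponent and the aperture figures below are generic assumptions for the sake of arithmetic, not Apple's numbers.

```python
# Back-of-the-envelope illustration of why aperture ratio matters. OLED
# luminance scales with current density J (current / emitting area), and a
# commonly cited degradation model puts lifetime at T proportional to
# J**(-n), with an acceleration factor n around 1.5-2. The exact n is
# panel-specific; the values below are illustrative assumptions.

def relative_lifetime(aperture_a, aperture_b, n=1.7):
    """Lifetime of panel A relative to B at equal luminance and pixel pitch.

    Equal luminance from a larger emitting area means proportionally lower
    current density: J_a / J_b = aperture_b / aperture_a.
    """
    j_ratio = aperture_b / aperture_a
    return j_ratio ** (-n)

# e.g. raising the emitting fraction of each pixel from 30% to 40%:
gain = relative_lifetime(0.40, 0.30)   # roughly 1.6x lifetime in this model
```

Even a modest aperture increase compounds into a sizable lifetime margin under this model, which is consistent with the burn-in resistance discussed later.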

You may be asking: if a larger aperture ratio has so many advantages, why hasn't even Samsung's own 2K panel followed suit? Why does Apple stand alone?

Honestly, the answer is simple: yield and cost. OLED panels are made by vacuum evaporation. The R, G, and B emitters are deposited in three separate passes, each through a fine metal mask; material that passes through the mask openings lands on the substrate and forms the pixels. To protect yield and avoid dead pixels, the mask openings are made slightly larger than the pixels.

But when the aperture ratio is high and the pixels are dense, sizing those openings becomes contradictory: too small, and material fails to land, leaving dead pixels; too large, and it spills onto neighboring pixels, corrupting their color. In short, this is an industry-wide problem. I don't know how much time and money Apple spent solving it, but the end result is that Apple's screens get a fully dedicated production line at Samsung, iPhone brightness has become an industry benchmark, and power consumption is excellent too. Such is the power of money.

After all, Mr. Cook is a master of the supply chain; he would never let screen costs balloon without limit. He must have found a way to contain them, and we can actually read the answer off one parameter: the screen's PPI.

Apple really is a different kind of company. Its products are defined by screen size, not by resolution like everyone else's. Apple's resolutions all look odd, but study the spec sheets and you will find the PPIs are identical or very close. Across every iPhone of the past decade, since the move to Retina displays, LCD models have used only two values, 326 and 401, and if I remember correctly OLED only three: 458, 460, and 476. I won't list the iPads and Macs one by one.
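Those numbers fall straight out of the geometry: PPI is just the diagonal pixel count over the diagonal size. A quick check using commonly reported panel figures (the 6.1-inch-class iPhone panel is usually quoted at about 6.06 inches of active diagonal, so treat the inputs as approximate):

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

# iPhone 14 class: 1170 x 2532 on ~6.06 inches, quoted by Apple as 460 ppi.
iphone_14 = ppi(1170, 2532, 6.06)
```

Run the same formula over other models in the lineup and the handful of repeating PPI values emerges, which is exactly the pattern described above.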

From the user's perspective, screen size becomes the only thing to weigh: screens of different sizes deliver similar clarity and image quality, and the idea that "Apple screens are good" gets planted in users' minds. From a cost-control perspective, screens with the same PPI can effectively be treated as one screen, because they are cut from the same large mother panel in production.

That even holds across generations: this year's 14 Plus and last year's 13 Pro Max differ in refresh rate, but their size and PPI are exactly the same, so many production steps can surely be reused, saving further cost.

And there is a well-hidden cost here too: the R&D cost of subpixel rendering. The algorithm is tightly bound to PPI, and since Apple's panels are all exclusive custom parts, Apple develops the subpixel rendering itself. Screens with the same PPI share the same algorithm, so nothing is reinvented and optimization effort stays focused. Two birds with one stone indeed.

Most importantly, once you can control individual subpixels, a door to a new world opens, because you can also do something else: burn-in compensation. As everyone saw this year, the iPhone's always-on display (AOD) is very different from Android's: the screen never blanks, it simply shows the lock screen, static. In theory this should cause burn-in more readily than Android's AOD. But on one hand the larger aperture ratio extends panel life, and on the other Apple has its own compensation algorithm.

A reminder before we go on: OLED burn-in compensation is a closely guarded secret, so I can only share some experimental results and speculation. Please take it as a reference, and if you have studied this in depth, I'd love to hear from you in the comments or by private message.

Three years ago we ran extensive aging experiments on the XS Max and found that Apple's screens lasted far longer than contemporary Android phones'; it was genuinely hard to produce visible burn-in. That made us very curious, and after a technical teardown with a domestic panel maker, we concluded that the flash memory chip Apple attaches to the OLED panel may be the key.

Our speculation is that one function of this chip is to store the cumulative on-time of every subpixel and compensate brightness according to each pixel's own lifetime curve. Note: every subpixel, roughly 8 million data points. The result is that short of very severe aging you will hardly ever see burn-in; however, as usage accumulates, the screen's peak brightness declines to match the most-aged pixels.
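The speculated scheme can be sketched like this. To be clear, this is our guess at the mechanism, not Apple's implementation, and the exponential aging curve with its half-decay constant is an invented placeholder; real curves are measured per panel.

```python
# Sketch of per-subpixel burn-in compensation as speculated above: keep a
# cumulative on-time per subpixel, map it through an aging curve to get its
# remaining relative luminance, then scale drive values so every subpixel
# matches the most-aged one. Uniformity is preserved; peak brightness drops.

def remaining_luminance(hours, half_decay_hours=10_000.0):
    """Toy aging curve: exponential decay (placeholder, not measured data)."""
    return 0.5 ** (hours / half_decay_hours)

def compensation_gains(on_time_hours):
    """Per-subpixel drive gains that equalize output to the most-aged one."""
    lum = [remaining_luminance(h) for h in on_time_hours]
    floor = min(lum)                     # most-aged subpixel sets the cap
    return [floor / l for l in lum]      # gains <= 1, so no overdrive needed

gains = compensation_gains([0.0, 5_000.0, 10_000.0])
# the never-lit subpixel is dimmed the most; the most-aged one runs at full
```

Scaling down to the floor rather than boosting aged pixels matches the observed behavior described above: no visible burn-in, but slowly declining peak brightness.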

This may also be why Apple introduced screen pairing in 2019: only with an official screen can this stored data be recognized and the burn-in compensation function work normally.

Frankly, the approach is very clever and well thought out. It both keeps repairs within official channels and ties the on-time record to the screen itself: no matter how long a given screen is used, or which phone it is moved to, its output will not change.

We speculate Apple does this both because the iPhone genuinely excels in aging tests and because Apple is genuinely capable of it in a way others find hard to copy: the flash chip and the DDIC must both be custom; subpixel rendering and burn-in compensation are tightly coupled and must be designed together; and the aging curve of each pixel must be measured in advance. Any one of these may be a mountain for another product, but Apple holds them all in its own hands, and turned them into a unique feature: the always-on display of the iPhone and the Apple Watch.

This is the power of Apple’s vertical integration.

That said, Samsung's own phones do in principle have the same opportunity. But judging from our aging tests back then, their panels performed unremarkably, and their current AOD is no richer than other Android phones', so I suspect there are still unsolved difficulties.

Summary

Time to sum up again. After analyzing Apple's products, I often get the feeling that being a product manager at Apple might be a very comfortable job: there are few constraints, you only need to focus on the right things, and the technical team is strong enough to realize your ideas.

But focusing on the right things, and thinking a solution through thoroughly and completely: isn't that exactly the hardest part of a product manager's job? With no external constraints to blame, the product manager is tested all the more, because there is nothing left to hide behind.

Take today's cluster of features: I might have thought of subpixel rendering and burn-in prevention on my own, but official screen pairing would likely never have occurred to me. And if you think about it carefully, the underlying logic of subpixel rendering and of local dimming on mini-LED is exactly the same: both exist to keep edges smooth. How much did the local dimming algorithm borrow from it?

So the more I study Apple, the more it feels like everyone's teacher. It keeps supplying the best solution, which forces you to imitate it, and sometimes it is hard even to imitate. How can it be mocked for not innovating?

Take burn-in prevention as an example. Everyone is researching solutions now, but the immediate obstacles are that subpixel rendering algorithms, custom DDIC features, and pixel aging research are all hard problems. Sometimes even with a good idea, you find yourself blocked by someone else's piece of the stack and unable to cover everything.

Still, the strong rise of domestic panels is changing the industry. As I said in the K50U video, domestic suppliers are very open, and the technical exchange during collaboration is sincere; they are willing to work together to improve how screens look.