From improvements in the efficiency of OLED materials to software developments and new testing techniques, OLED burn-in risk has been lowered. OLED monitors are generally a more sound investment than ever—at least for the right person.
The TL;DR is that panels now track how long each pixel has been lit. The device can then evenly wear down the less-used pixels so degradation stays uniform across the screen. The trade-off is that you lose some max brightness in the name of screen uniformity.
The other point: people often mistake a shift in the TFT layer for burn-in, but it’s actually image retention, and these TV maintenance cycles solve it. Just hope the compensation cycle actually runs, since some panels simply fail to run them.
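To make the idea concrete, here’s a toy Python sketch of the wear-tracking and compensation logic described above. It’s my own simplified model (linear efficiency decay, made-up decay rate), not any vendor’s actual firmware:

```python
import numpy as np

class WearCompensator:
    """Toy model: pixel efficiency decays linearly with accumulated on-time."""

    def __init__(self, height, width, decay_per_hour=1e-5):
        self.on_hours = np.zeros((height, width))  # per-pixel lit time
        self.decay = decay_per_hour                # assumed wear rate

    def record_frame(self, frame, hours):
        # frame: normalized luminance 0..1; brighter pixels wear faster
        self.on_hours += frame * hours

    def compensate(self, frame):
        # Remaining efficiency per pixel (1.0 = new); floor avoids divide-by-zero
        eff = np.clip(1.0 - self.decay * self.on_hours, 0.05, 1.0)
        # Dim every pixel to match the most-worn one: uniform panel,
        # at the cost of peak brightness
        return frame * (eff.min() / eff)
```

Note that the per-pixel gain is always ≤ 1, which is exactly the max-brightness cost mentioned above.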
Check out this RTings video for a good overview of lots of different TV brands and how they perform.
PS: Burn-in is actually a misnomer from the CRT era. There’s nothing burning in; the pixels are burning (wearing) out.
Thank you for the summary. My takeaway: so you’re saying I should still get a mini-LED TV.
I have both:
- an 85" TCL R655 with a bunch of dimming zones that works great in my sunlight-heavy living room for both daytime viewing and family movie night.
- a 55" LG C1 in my gaming/home-office/theater room with blackout curtains that is great for PC gaming and an awesome theater experience.
I would say it depends on your viewing environment. The inability of an OLED to get bright can ruin the experience. But my game room has blackout curtains and it’s enclosed.
I just recently moved from a 34" ultrawide to mounting the 55" on my desk. It’s oversized for my viewing distance, but 4K is about 8.3 million pixels (3840 × 2160), so I rarely run apps in or near fullscreen anymore. I think a 42" LG OLED is perfect for PC (great out-of-the-box calibration and 120 Hz G-Sync). Samsung’s QD-OLED panels are technically better, but I don’t trust them to run their compensation cycles.
If you’re worried about burn-in on PC, just set a screensaver to blank your screen after 2 to 5 minutes; that’s what they were invented for back in the CRT era. For regular media consumption it’s a non-issue: RTings left a static image on a Sony OLED for 120 hours and the retention basically went away after one compensation cycle.
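If you want to trigger the blanking yourself, here’s a minimal Windows-only Python sketch using the standard-library ctypes and the Win32 SC_MONITORPOWER message (wiring it to an idle timer is left out for brevity):

```python
import ctypes

# Windows-only: broadcast WM_SYSCOMMAND/SC_MONITORPOWER to turn displays off,
# which is effectively what a "blank" screensaver does.
HWND_BROADCAST = 0xFFFF
WM_SYSCOMMAND = 0x0112
SC_MONITORPOWER = 0xF170
MONITOR_OFF = 2

ctypes.windll.user32.SendMessageW(
    HWND_BROADCAST, WM_SYSCOMMAND, SC_MONITORPOWER, MONITOR_OFF
)
```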
I have a budget Samsung 55" NU7400 and I can’t see shit playing a PS5 game in HDR during the day. I have to close the blackout curtains, otherwise all I see is my own reflection.
For my next TV I’ll do some research and spend a bit more money: 120 Hz, more nits, VRR, etc.
The NU7400 peaks at 337 nits, and that’s with the poorer contrast ratio of an LCD. My LG C1 hits 780 nits, and I still find it a bit weak with the lights on, so I can’t imagine 330 nits on an LCD.
Yeah, HDR is meant to be watched in a dim (~5-nit) environment, but sometimes that’s just not reasonable. While my LG is technically better, bright shows like Rings of Power are more enjoyable with the 1500 nits my TCL can output. Once the ABL (Automatic Brightness Limiter) kicks in on the OLED, you absolutely need the blackout curtains.
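For anyone curious what ABL means numerically, here’s a toy Python model. The linear roll-off and the 130-nit full-screen figure are my assumptions; real ABL curves are vendor-specific:

```python
def abl_peak_nits(apl: float,
                  small_window_peak: float = 780.0,
                  full_screen_peak: float = 130.0) -> float:
    """Toy ABL curve: allowed peak brightness as a function of average
    picture level (APL, 0.0 = black frame, 1.0 = full white).
    780 nits is my C1's small-window peak; the 130-nit full-screen figure
    and the linear roll-off are assumptions, not LG's actual algorithm."""
    return small_window_peak - apl * (small_window_peak - full_screen_peak)

# A mostly-bright frame (APL 0.8) gets clamped hard:
print(abl_peak_nits(0.8))  # -> 260.0
```

That’s why bright, high-APL content is where the big LCD’s 1500 nits pulls ahead despite the OLED’s better contrast.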
Thanks for the hints. So in a bright room, a TV with 1500+ nits is ideal for HDR, right?
But even with a 1500-nit TV, HDR will still be much better in a dark room (where OLED shines)?
One thing I also noticed: my monitor (an LG 27GL850-B, around 350 nits I think) is much easier and clearer to see in direct sunlight because of its anti-glare coating.
But I doubt anti-glare/matte finishes are a thing you find on TVs.