A “nit” is another way to describe a brightness of 1 candela per square meter (cd/m²). A typical movie theater screen can get as bright as about 50 nits. If your TV is a few years old, pre-HDR, it can probably reach between 100 and 400 nits, with the now-defunct plasmas at the low end of that range and high-end LCDs at the high end.
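Since a nit is just a unit of luminance, it converts directly to other luminance units. As a rough sketch (using the standard conversion factor of 1 foot-lambert ≈ 3.426 cd/m²; the function name is illustrative), here is how nits map to foot-lamberts, the unit traditionally used to rate cinema screens:

```python
# Convert luminance in nits (cd/m^2) to foot-lamberts (fL).
# Standard conversion: 1 fL = 3.4262591 cd/m^2.
CD_PER_M2_PER_FL = 3.4262591

def nits_to_foot_lamberts(nits: float) -> float:
    """Convert a luminance value from nits to foot-lamberts."""
    return nits / CD_PER_M2_PER_FL

# A ~50-nit theater screen works out to roughly 14.6 fL,
# close to the common 14-16 fL cinema reference range.
print(round(nits_to_foot_lamberts(50), 1))
```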
TV manufacturers have always striven to create bright televisions. The brightest TV is the one that sold, or so the old adage went. Now, in the HDR era, this brightness has another purpose: picture quality. One of the main aspects of HDR performance is creating realistic highlights. The brighter these small areas of the screen are, the better. Imagine, for example, a glint off an aircraft’s metallic skin. In real life, this will be significantly brighter than the rest of the scene. On a great HDR TV, it is as well.
This isn’t to say a 2,000-nit TV is always going to look better than a 1,500-nit TV, but it can be a factor. Brightness (nits) is only one half of the all-important contrast ratio equation; the other half is black level. Meanwhile, new technologies like quantum dots are pushing overall performance, including brightness, to levels we couldn’t have imagined 10 years ago.
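To see why more nits alone don’t guarantee a better picture, consider the contrast ratio: peak brightness divided by black level. The numbers below are hypothetical, chosen only to show that a dimmer TV with deeper blacks can win on contrast:

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Contrast ratio = peak luminance / black level (both in nits)."""
    return peak_nits / black_nits

# Hypothetical sets: a 1,500-nit TV with a 0.05-nit black level
# versus a brighter 2,000-nit TV with a 0.5-nit black level.
print(contrast_ratio(1500, 0.05))  # 30,000:1
print(contrast_ratio(2000, 0.5))   # 4,000:1
```

Despite being 500 nits dimmer, the first TV has over seven times the contrast ratio, which is why black level matters as much as raw brightness.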
Modern TVs can be much brighter, with top-of-the-line HDR TVs putting out over 1,500 nits. In the next few years, we’ll likely see even higher light output.