A Brief Glossary of Light Concepts
The Key Differences Between Lumens and Nits
Can Lumens and Nits Be Converted to Each Other? How?
Before we can proceed with comparing and contrasting these two units of light, we think it is best to cover some basic terminology.
Put simply, a lumen is the total amount of visible light emitted by a light source. More precisely, it is the unit of measurement for luminous flux. (Do not confuse it with "lux," which measures illuminance: one lux is one lumen spread over one square meter.) If you are dealing with anything that sheds light, even a simple TV or remote with a few lights indicating that it is on or sending a signal, some level of light is being generated; and whenever light is generated, it can be measured in lumens.
That said, the most common products that explicitly advertise their lumens are lightbulbs and projectors. With lightbulbs, customers want to know how much brighter things will get when they screw a new bulb into a ceiling fan's light socket or a personal flashlight. With a projector, people want an idea of how sharply its images will contrast with whatever surface serves as the screen.
Within a given bulb technology, a lightbulb's lumen output is roughly proportional to its wattage, though LEDs produce far more lumens per watt than incandescents. Manufacturers publish lumen benchmarks alongside the wattages that achieve them for both incandescent and LED bulbs, so if you are looking for a certain level of brightness, those figures give you a good baseline for your needs.
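The benchmark idea above can be sketched in code. The figures below are the widely published approximate equivalences for household bulbs (for example, roughly 800 lumens from a 60 W incandescent or a ~9 W LED); actual output varies by manufacturer, and the helper function name is illustrative.

```python
# Rough, widely published lumen benchmarks for household bulbs.
# Values are approximate; real products vary by manufacturer.
LUMEN_BENCHMARKS = [
    # (lumens, incandescent watts, typical LED watts)
    (450, 40, 6),
    (800, 60, 9),
    (1100, 75, 12),
    (1600, 100, 15),
]

def led_watts_for(target_lumens):
    """Return the smallest benchmarked LED wattage that meets target_lumens,
    or None if the target exceeds every benchmark."""
    for lumens, _incandescent_w, led_w in LUMEN_BENCHMARKS:
        if lumens >= target_lumens:
            return led_w
    return None

print(led_watts_for(800))  # -> 9
```

For instance, replacing a 60 W incandescent (about 800 lumens) calls for roughly a 9 W LED under these benchmarks.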
A nit is an industry unit of measurement for luminance, and it is synonymous with the candela per square meter: a single nit equates to 1 candela per square meter. The term "nit" comes from the Latin "nitere," meaning "to shine."
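The definition is simple enough to express directly: luminance in nits is luminous intensity (candela) divided by the emitting area in square meters. The function below is just an illustration of that relationship.

```python
def nits(candela, area_m2):
    """Luminance in nits: 1 nit = 1 candela per square meter."""
    return candela / area_m2

# A source emitting 300 cd over a 1.5 m^2 surface:
print(nits(300, 1.5))  # -> 200.0
```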
When browsing the market for lighting equipment, you will find that your average consumer LCD display, the kind intended for indoor use, generates a luminance of anywhere between 200 and 350 nits. LCD displays intended for outdoor use are a good deal brighter: those meant only for daylight viewing out of direct sun generally produce 400 to 700 nits, and that figure surges past 1,000 for displays meant to be seen in direct sunlight.
If you prefer LED surfaces, you can find models capable of producing anywhere from 1,000 to 5,000 nits. If you are running a concert or other large-scale event and really want your images to be seen without any loss of quality, these are perfect. Indoor-only LED screens, like a scoreboard, can hit 2,000 nits; outdoor screens like those used by digital billboards can easily exceed 8,000 nits. The greatest influence on the required brilliance of an LED wall or screen is how much direct sunlight will hit it; shady areas can make do with fewer nits, but constant exposure to direct sunlight requires a sufficiently bright screen.
Modern projectors typically deliver the equivalent of 1,000 to 2,000 nits, bright enough for most situations. That said, brightness ratings are just one factor when assessing a projector's real-world image; you must also consider the size and type of screen being used. A high-output projector can still produce a dim image on a very large screen, while a modest projector can produce a brilliant image on a small one. In short, the larger the screen used, the more projector brightness is needed to keep the image bright.
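The screen-size effect described above can be sketched with a common approximation: for a matte (roughly Lambertian) screen with gain 1.0, on-screen luminance in nits is approximately the projector's lumens divided by pi times the screen area. The function and parameter names here are illustrative, and real screens deviate from this ideal.

```python
import math

def screen_nits(projector_lumens, screen_area_m2, gain=1.0):
    """Approximate on-screen luminance in nits, assuming a matte
    (Lambertian) screen that spreads reflected light over a hemisphere.
    'gain' models screens that reflect more light back toward viewers."""
    return projector_lumens * gain / (math.pi * screen_area_m2)

small = screen_nits(2000, 1.0)  # ~637 nits on a 1 m^2 screen
large = screen_nits(2000, 6.0)  # ~106 nits on a 6 m^2 screen
print(round(small), round(large))
```

Note how the same 2,000-lumen projector looks roughly six times dimmer when its image is spread over six times the area.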
The current generation of smartphones and tablets can generate a great deal of light, with the average smartphone easily capable of anywhere from 200 to 1,000 nits or more. The need for such high numbers mostly comes down to viewing content with large white areas, like reading a digital book's black letters against a white background. Conversely, your average laptop or desktop monitor generates somewhere between 200 and 600 nits, because such machines are typically used indoors. If you are just browsing the web, streaming media, and working with text documents, 200 nits is more than sufficient; anyone who does a lot of gaming or multimedia editing on their machine will want considerably more.
Televisions show some of the greatest variance in nit count because the technology has been around long enough to spawn numerous varieties and iterations. An OLED TV may generate as little as 30 nits, while a plasma screen can easily exceed 600. The average modern TV generates around 450 nits, and some models, like HDR units, can reach in excess of 2,000 nits. That headroom is great, but most TVs produce their best image in the 100-300 nit range; if a TV cannot properly display a rich, inky black, its highlights will be less striking.
While both of these terms are used to gauge light output, they measure different things: lumens measure the total visible light a source emits (luminous flux), while nits measure how bright a surface appears per unit of area (luminance).
Because the two units describe the same underlying light, you can convert between them if you also know the size of the emitting surface: luminance (nits) is, roughly, luminous flux (lumens) divided by the emitting area and the solid angle over which the light spreads.
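Under the idealized assumption of a flat, perfectly diffuse (Lambertian) emitter, that relationship reduces to a simple pair of formulas: nits = lumens / (pi x area), and lumens = nits x pi x area. The sketch below illustrates the round trip; real displays deviate from this ideal, so treat the numbers as ballpark figures.

```python
import math

def lumens_to_nits(lumens, area_m2):
    """Approximate luminance of a flat, perfectly diffuse (Lambertian)
    emitter of the given area. Real displays deviate from this ideal."""
    return lumens / (math.pi * area_m2)

def nits_to_lumens(nits, area_m2):
    """Inverse of the above, under the same Lambertian assumption."""
    return nits * math.pi * area_m2

# Round trip: converting to nits and back recovers the original lumens.
print(round(nits_to_lumens(lumens_to_nits(1000, 0.5), 0.5)))  # -> 1000
```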
Nits and lumens are certainly both important to consider when deciding on a light-emitting device, but they are not the only factors. You should also be mindful of the device's color accuracy, contrast, interface, refresh rate, resolution, and viewing angle. Hopefully you now have a much better grasp of the distinction between these two terms and feel empowered to decide which device(s) are ideal for your specific display needs. When shopping around, establish a baseline of low-, medium-, and high-nit devices that fit those needs.