We begin this series of articles entitled The Clinical Eye by analysing two basic parameters of any video display system, whether a monitor, a projector or a videowall: brightness and gamma.
In subsequent issues, we shall be delving into further aspects of images and their quality, the reasons behind colour distortions, how to measure the reliability of monitors and how to solve (to the extent possible) the poor adjustments and imperfections that we are unfortunately faced with all the time.
Digitalisation has entailed such a profound paradigm shift in the Audiovisual Industry that it has required a change in how we think and work. From a technological standpoint, not much of what was acceptable just a few years ago remains in place today. New generations of equipment, tools and applications never cease to appear, replacing their predecessors in a continuous cycle. While each new generation of products brings improvements over its predecessor, and we are certainly experiencing one of the most exciting stages of human communication ever, we must admit that these advances often come faster than our ability to grasp them.
On this occasion, our reflections shall focus on monitors, the last link in the value chain of the Audiovisual Industry. A monitor is the final connection between technology and the human factor: the eyes, the mind and, in short, the world of sensations. There is little point in taking great pains to perfect each stage of the technical-creative process if the programme is displayed on a poorly adjusted, or downright bad, monitor that fails to show dark details, displays burnt-out highlights or casts a green tint over every colour. Worse still, what is the point of adjusting cameras or performing colour corrections for hours on end to achieve the desired dramatic tone for the image in our programme if we do so on a poorly adjusted monitor? Or think of when we believe we have achieved a precise tone, reminiscent of the environment we wish to convey, when in reality we have used colours totally different from those envisaged…
We must be well aware that monitors are, essentially, analogue pieces of equipment, with everything this implies. The overwhelming digitalisation of production processes, as brilliant as it is practical, ensures that the image is not degraded during production, post-production or transmission, and could lead us to believe that monitors are free from sin, or in other words perfect, simply for having digital inputs. Unfortunately this is not so. The monitor does indeed receive the signal in digital form, but it must convert it to analogue in order to tell each pixel (or, more precisely, each RGB subpixel) how much light the LCD panel must let through or the OLED panel must generate. So the digital-to-analogue conversion, the processing of the analogue signal and the extreme precision it requires for the displayed colour to be correct, together with the many factors that make signals deviate from their theoretical values, can result in the actual displayed image bearing little resemblance to the image defined by the digital values that reach the monitor through an SDI video cable.
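As a rough illustration of that conversion chain, the sketch below maps an 8-bit digital code value to a relative light output through a gamma curve, and then simulates the error a limited-precision analogue stage can introduce. The gamma value, bit depth and number of DAC levels here are assumptions chosen for illustration, not the specification of any real monitor.

```python
GAMMA = 2.4  # assumed display gamma for this sketch; real monitors vary


def code_to_light(code, bit_depth=8, gamma=GAMMA):
    """Map a digital code value to relative light output (0.0 to 1.0)."""
    normalized = code / (2 ** bit_depth - 1)
    return normalized ** gamma


def quantize(value, levels):
    """Simulate an analogue stage limited to a fixed number of output levels."""
    return round(value * (levels - 1)) / (levels - 1)


# Ideal light output for a mid-grey code value...
ideal = code_to_light(128)
# ...versus the same value passed through a coarse 256-level stage.
actual = quantize(ideal, 256)
error = abs(actual - ideal)
```

Because the gamma curve compresses dark values, even a small error in this stage is most visible in the shadows, which is one reason a poorly adjusted monitor loses dark detail first.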