The methodology used in this work can be divided into two steps: (1) Colorimetric characterization of the VR display and (2) implementation of a color management system adapted to VR. Each step is explained in detail below.
2.1. Chromatic Characterization of a VR Device
The first step in using a virtual reality system for tasks related to color vision research is the chromatic characterization of the Head Mounted Display (HMD). Each device of this type has its own specific characteristics in terms of the chromaticity of its primary colors and its white point, as well as the relationship between the digital values of the digital-to-analog converter (DAC) and the associated XYZ tristimulus values.
To chromatically characterize the VR device used in this study, spectroradiometric measurements were made with a tele-spectroradiometer aligned with the optical axis of the lenses with which the HMD is equipped. We measured the display and lens assembly as a whole, leaving the measurement of the screen with and without lenses at different points of the screen for future studies. These lenses allow the user to position their eyes correctly with respect to the displays and to obtain an image covering a large visual field. On the negative side, they magnify the image of the pixels, making individual pixels perceptible to users. The chromaticity values and the average relative luminance of both displays are shown in Table 1.
The measured spectral power distribution of the RGB primaries is shown in Figure 3. The spectral radiance of each channel reveals the OLED nature of these displays, with a narrow bandwidth for each RGB channel.
The color gamut is the subset of colors that can be accurately represented in a given color space or by a certain output device, such as a display. In this work, we measured the color gamut of our HTC Vive device and compared it with that of other devices, such as an Oculus Rift CV1 and classic CRT and TFT monitors (Figure 4).
We analyzed the relationship between the values of the digital-to-analog converter (DAC) of each RGB channel and their corresponding luminance values Y (Figure 5). The measurements were made with our tele-spectroradiometer for each of the R, G, and B chromatic channels independently, over the range of DAC values from 0 to 255 in steps of five units.
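As an illustration of how a per-channel gamma can be estimated from these luminance ramps, the sketch below fits the exponent of Y ≈ Ymax (d/255)^γ by least squares in log-log space. The array names are illustrative, and this is only a minimal sketch of the idea, not the MATLAB procedure provided as Supplementary Material.

```csharp
using System;
using System.Linq;

// Minimal sketch: estimate the gamma exponent of one RGB channel from a
// measured DAC-value vs. luminance ramp, assuming Y(d) ≈ Ymax * (d/255)^gamma.
// The input arrays stand in for the spectroradiometric measurements.
static class GammaFit
{
    public static double FitGamma(int[] dacValues, double[] luminance)
    {
        double yMax = luminance.Max();          // luminance at DAC = 255
        double sumXY = 0.0, sumXX = 0.0;

        for (int i = 0; i < dacValues.Length; i++)
        {
            if (dacValues[i] == 0 || luminance[i] <= 0.0) continue; // log undefined
            double x = Math.Log(dacValues[i] / 255.0);   // normalized digital value
            double y = Math.Log(luminance[i] / yMax);    // normalized luminance
            sumXY += x * y;
            sumXX += x * x;
        }
        // Least-squares slope through the origin in log-log space equals gamma.
        return sumXY / sumXX;
    }
}
```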
As a result of this analysis, and considering the computational time constraints of VR systems, a linear chromatic characterization model preceded by a gamma linearization stage was used. This simplified color characterization model is widely used in color management [23].
This model uses a linear transformation between the RGB’ values and the normalized XYZ tristimulus values with a 3 × 3 matrix (Equation (2)). The RGB’ values were obtained after a gamma correction of the normalized RGB values, which guaranteed the linearity of the system (Equation (1)). Table 1 shows the gamma value of each RGB channel and the corresponding R² fit index.
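A minimal sketch of this two-stage model is given below, assuming per-channel gamma values and a 3 × 3 matrix obtained from the characterization. The numerical values shown are placeholders (the standard sRGB primaries and a nominal gamma of 2.2), not the measured values reported in Table 1.

```csharp
using System;

// Sketch of the two-stage characterization model: per-channel gamma
// linearization (Equation (1)) followed by a 3x3 linear transform to XYZ
// (Equation (2)). Gamma values and matrix entries are illustrative placeholders.
static class DisplayModel
{
    static readonly double[] Gamma = { 2.2, 2.2, 2.2 };   // per-channel gammas (placeholder)
    static readonly double[,] M =                          // RGB' -> XYZ matrix (placeholder)
    {
        { 0.4124, 0.3576, 0.1805 },
        { 0.2126, 0.7152, 0.0722 },
        { 0.0193, 0.1192, 0.9505 }
    };

    public static double[] RgbToXyz(int r8, int g8, int b8)
    {
        // Normalize the 8-bit DAC values and linearize each channel: c' = c^gamma.
        double[] rgbLin =
        {
            Math.Pow(r8 / 255.0, Gamma[0]),
            Math.Pow(g8 / 255.0, Gamma[1]),
            Math.Pow(b8 / 255.0, Gamma[2])
        };

        // Linear 3x3 transform to normalized XYZ tristimulus values.
        double[] xyz = new double[3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                xyz[i] += M[i, j] * rgbLin[j];
        return xyz;
    }
}
```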
To confirm the accuracy of the color characterization model, we measured 50 random RGB color samples. These values were compared with those predicted by the mathematical model, yielding an average color difference of ∆E00 = 1.8. All measurement data, together with the MATLAB script used to obtain the chromatic characterization model, are available as Supplementary Material for this publication.
2.2. Implementation of a Color Management Procedure Adapted to VR Systems
After performing the spectral characterization of the HMD, the next step required to obtain a faithful reproduction of color inside a virtual reality system is to introduce a color management procedure into the 3D graphics engine. The possible lighting configurations in the 3D software used are practically unlimited; for example, the rendering can be programmed in different ways using different shaders. The final appearance of the virtual reality scene depends on the color of the light source used, the color of the material, the gloss, and the interaction of the different elements that form the virtual scene through shading, primary and secondary reflections, etc. For all these reasons, we adopted a series of simplifications that allowed us to deal with the problem:
We focused on color, disregarding glossy objects and deactivating secondary reflections.
We limited the 3D software to real-time processing, disabling the Baked and Global Illumination options.
We used Unity’s standard shader and configured the player with its linear color space option and forward rendering activated.
By selecting these options, we aimed to establish a configuration to analyze and compare the results of the implemented color management system.
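For illustration, a configuration along these lines could be applied from a Unity editor script such as the sketch below. The specific properties used (PlayerSettings.colorSpace, Lightmapping.bakedGI, RenderSettings) depend on the Unity version, forward rendering is assumed to be selected in the Graphics settings, and this is only one possible way to reproduce the simplifications listed above.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of an editor-side setup consistent with the simplifications above:
// linear color space, no baked or real-time global illumination, and matte
// (non-glossy) standard-shader materials. Exact APIs vary between Unity versions.
public static class VrColorSetup
{
    [MenuItem("Tools/Configure VR Color Scene")]
    public static void Configure()
    {
        PlayerSettings.colorSpace = ColorSpace.Linear;   // linear rendering option

        Lightmapping.bakedGI = false;                    // disable Baked GI
        Lightmapping.realtimeGI = false;                 // disable real-time GI

        RenderSettings.ambientMode = AmbientMode.Flat;   // single flat ambient term
        RenderSettings.ambientLight = Color.black;       // no ambient contribution
        RenderSettings.reflectionIntensity = 0f;         // suppress secondary reflections

        // Make every standard-shader material matte so gloss does not affect color.
        foreach (Renderer r in Object.FindObjectsOfType<Renderer>())
            foreach (Material m in r.sharedMaterials)
                if (m != null && m.HasProperty("_Glossiness"))
                    m.SetFloat("_Glossiness", 0f);
    }
}
#endif
```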
3D scene rendering engines do not apply any color management by default. The native format used to define both light sources and object textures in this type of software is the sRGB digital color space, with a bit depth of 8 bits per channel. This sRGB space is widely used in computer science and image processing and is characterized by a specific gamut, defined by the chromaticity of its primaries, and by a non-linear transformation (gamma) of approximately 2.2. The white point of this color space is D65.
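For reference, the sRGB non-linearity is in fact a piecewise function whose power branch uses an exponent of 2.4, and it is commonly approximated by an overall gamma of 2.2. The following sketch shows the standard decoding and encoding between 8-bit sRGB values and linear light values.

```csharp
using System;

// The sRGB transfer function is piecewise (a linear toe plus a 2.4 exponent),
// commonly approximated as an overall gamma of about 2.2. These helpers convert
// between 8-bit sRGB values and linear light values in [0, 1].
static class Srgb
{
    public static double ToLinear(int value8Bit)
    {
        double c = value8Bit / 255.0;
        return c <= 0.04045
            ? c / 12.92
            : Math.Pow((c + 0.055) / 1.055, 2.4);
    }

    public static int FromLinear(double linear)
    {
        double c = linear <= 0.0031308
            ? 12.92 * linear
            : 1.055 * Math.Pow(linear, 1.0 / 2.4) - 0.055;
        c = Math.Max(0.0, Math.Min(1.0, c));
        return (int)Math.Round(c * 255.0);
    }
}
```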
The color management procedure implemented in this work has two levels of accuracy. At the first level, a C# script was implemented that calculates the RGB values of a simulated light source in the VR scene starting from the spectral power distribution of the source and the spectral characterization of the HMD used.
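A sketch of this first level is shown below, assuming that the spectral power distribution (SPD) and the CIE 1931 color-matching functions are sampled at the same wavelengths, and that the inverse characterization matrix and per-channel gammas of the HMD are available. The function and variable names are illustrative, not those of the actual script.

```csharp
using System;

// Sketch of the first level of color management: from the SPD of a virtual
// light source to the RGB value assigned to it. The color-matching-function
// samples and the inverse characterization matrix are placeholders; in the
// actual implementation they come from the CIE 1931 tables and from the HMD
// characterization described in Section 2.1.
static class LightSourceRgb
{
    // spd, xBar, yBar, zBar: sampled at the same wavelengths (e.g., 400-700 nm).
    public static int[] SpdToRgb(double[] spd, double[] xBar, double[] yBar,
                                 double[] zBar, double[,] xyzToRgb, double[] gamma)
    {
        double X = 0, Y = 0, Z = 0;
        for (int i = 0; i < spd.Length; i++)
        {
            X += spd[i] * xBar[i];
            Y += spd[i] * yBar[i];
            Z += spd[i] * zBar[i];
        }
        // Normalize so that the source itself has a relative luminance of 1.
        double[] xyz = { X / Y, 1.0, Z / Y };

        int[] rgb = new int[3];
        for (int c = 0; c < 3; c++)
        {
            double linear = 0;
            for (int j = 0; j < 3; j++)
                linear += xyzToRgb[c, j] * xyz[j];           // inverse 3x3 matrix
            linear = Math.Max(0.0, Math.Min(1.0, linear));   // clip out-of-gamut values
            rgb[c] = (int)Math.Round(Math.Pow(linear, 1.0 / gamma[c]) * 255.0); // inverse gamma
        }
        return rgb;
    }
}
```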
The second level of accuracy requires the introduction of the spectral texture of the virtual objects present in the virtual scene. For this part, we developed a C# script that performs all the computational processing of the virtual object texture, generating a different RGB texture for each lighting change. Notably, although in virtual reality systems rendering is performed at a minimum frequency of 90 Hz, this rendering uses the same light sources and RGB textures unless they are changed at run time.
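The per-texel computation of this second level can be sketched as follows, again with illustrative names: each texel’s reflected spectrum (its spectral reflectance multiplied by the SPD of the source) is integrated against the color-matching functions, normalized by the luminance of the bare source, and mapped to display RGB with the same pipeline as above.

```csharp
using System;

// Sketch of the second level: regenerate the RGB texture of a virtual object
// from its hyperspectral texture whenever the light source changes. All arrays
// are placeholders sampled at the same wavelengths; reflectance[t][i] is the
// reflectance of texel t at wavelength sample i.
static class SpectralTexture
{
    public static int[][] Relight(double[][] reflectance, double[] spd,
                                  double[] xBar, double[] yBar, double[] zBar,
                                  double[,] xyzToRgb, double[] gamma)
    {
        // Luminance of the bare source, used to normalize every texel.
        double sourceY = 0;
        for (int i = 0; i < spd.Length; i++) sourceY += spd[i] * yBar[i];

        var texture = new int[reflectance.Length][];
        for (int t = 0; t < reflectance.Length; t++)
        {
            double X = 0, Y = 0, Z = 0;
            for (int i = 0; i < spd.Length; i++)
            {
                double reflected = reflectance[t][i] * spd[i];   // reflected spectrum
                X += reflected * xBar[i];
                Y += reflected * yBar[i];
                Z += reflected * zBar[i];
            }
            double[] xyz = { X / sourceY, Y / sourceY, Z / sourceY };

            var rgb = new int[3];
            for (int c = 0; c < 3; c++)
            {
                double linear = 0;
                for (int j = 0; j < 3; j++) linear += xyzToRgb[c, j] * xyz[j];
                linear = Math.Max(0.0, Math.Min(1.0, linear));
                rgb[c] = (int)Math.Round(Math.Pow(linear, 1.0 / gamma[c]) * 255.0);
            }
            texture[t] = rgb;
        }
        return texture;
    }
}
```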
Figure 6 shows a flow diagram for both levels of the color management procedure developed. The first level is only applied to the virtual light sources defined inside the virtual scene. The second level requires applying the first one and then calculating the image texture of each 3D object starting from its hyperspectral image. The complete C# script for both procedures is attached as Supplementary Material for this paper.
To analyze the results obtained by introducing both levels of color management, we used a sample of the ColorChecker color chart (X-rite, USA). This color chart is widely used in color management tasks in both scientific and professional fields. The manufacturer of this color chart provides the sRGB reference values that the color patches must present under D65 lighting. These values are presented in the first three columns of Table 2. The ColorChecker was scanned with a 3D color scanner, which provided the geometry of the object as well as its color texture. The geometry is defined in an OBJ file as a point cloud. The color texture is defined by an 8-bit-per-channel BMP file obtained under the D50 LED light source that the scanner is equipped with.
Our main objective is to compare the efficiency of different color management methods in real-time 3D virtual environments such as VR. The great challenge of this work is to perform the rendering in real time while handling the computational complexities involved. To carry out this comparative study, we required not only the geometry and color texture of the ColorChecker but also its hyperspectral texture, that is, the spectral reflectance of each point of the color chart defined in the color texture file. Since this object is flat, we obtained the hyperspectral texture with a hyperspectral camera, model UHD 285 (Cubert GmbH, Ulm, Germany). In this way, we replaced the RGB color texture obtained from the 3D scanner with a hyperspectral texture defined between 400 and 1000 nm, using the 4 nm steps provided by the hyperspectral camera. Starting from this hyperspectral texture file, we calculated the average RGB values of each color patch of the ColorChecker corresponding to the D65 illuminant and the sRGB color space. We then compared these calculated RGB values with the theoretical RGB values provided by the manufacturer. Table 2 shows the reference values specified by the manufacturer and the values obtained in our calculations.
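For example, the reference value of a single patch can be obtained by averaging the spectral reflectance of all texels inside that patch and then converting the mean spectrum under the D65 SPD with the same spectral-to-RGB pipeline sketched above. The helper below is an illustrative sketch, not the actual processing code.

```csharp
// Average the spectral reflectance of the texels belonging to one ColorChecker
// patch; the resulting mean spectrum is then converted to RGB under D65 using
// the spectral-to-RGB conversion shown earlier. Indices and names are illustrative.
static class PatchReference
{
    public static double[] MeanReflectance(double[][] patchTexels)
    {
        int bands = patchTexels[0].Length;
        var mean = new double[bands];
        foreach (var texel in patchTexels)
            for (int i = 0; i < bands; i++) mean[i] += texel[i];
        for (int i = 0; i < bands; i++) mean[i] /= patchTexels.Length;
        return mean;   // feed this mean spectrum into the conversion under D65
    }
}
```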
To study the effects of these two levels of implemented color management, we used three different virtual light sources: a D65 illuminant, a D65 simulator obtained from a commercial light booth in our laboratory featuring 6-peak LED technology, and a theoretical LED source composed of only two spectral peaks, chosen in such a way that the color of this source over a diffuse reflectance target coincides exactly with the color of the D65 illuminant. Figure 7 shows the spectral power distribution of all light sources employed in this work.
Our 3D graphics engine uses sRGB as its native color space, whose white point corresponds to the D65 illuminant. The CIE 1931 XYZ tristimulus values of this illuminant, XYZ = (95.047, 100.0, 108.88), correspond to the 8-bit digital values RGB = (255, 255, 255). To prevent the incorrect definition of any light source whose X or Z value lies above those of the illuminant D65, we chose to work with normalized sources whose relative luminance is 85% that of the illuminant D65. Therefore, we defined the custom illuminant D65 source as XYZ = (80.82, 85.00, 92.54), which corresponds to an RGB value of (237, 237, 237).
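As a consistency check, scaling the D65 tristimulus values by 0.85 and encoding the resulting relative luminance with the sRGB non-linearity (whose power branch applies here) reproduces the reported 8-bit value:

$$X'Y'Z' = 0.85 \times (95.047,\ 100.0,\ 108.88) \approx (80.8,\ 85.0,\ 92.5)$$
$$d = \operatorname{round}\!\left(255\,\bigl(1.055 \cdot 0.85^{1/2.4} - 0.055\bigr)\right) = \operatorname{round}(255 \times 0.931) = 237$$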
Table 3 shows the XYZ and RGB values of the virtual light sources used, as well as their chromaticity values and Correlated Color Temperature (CCT).
The color of the objects present in a scene depends on the objects themselves but also on the light source illuminating them. The quality of a light source, in terms of the fidelity of the colors it produces in a scene compared with those obtained by illuminating the scene with a reference light source, can be quantified using the Color Fidelity Index Rf defined by the International Commission on Illumination (CIE) [24]. This value is included in Table 3, which describes the characteristics of the light sources used.
To assess the efficiency of the implemented color management system, a scene was designed in our 3D graphics engine in which only the virtual ColorChecker is illuminated by a single directional light source. The RGB values shown in Table 3 for the different virtual light sources were assigned to this directional source to simulate the appearance of the ColorChecker under each light source. In this way, the effectiveness of the first level of color management (first row of Figure 8) was tested. Subsequently, the second level of color management was enabled, in which the chromatic textures of the 3D objects were recalculated according to the light source used, in addition to the actions performed at the first level (second row of Figure 8).
Figure 8 shows the results obtained when applying the two levels of color management within the 3D scene containing the virtual ColorChecker. At first glance, it is difficult to see the difference between the two levels of color management, except in the case of the two-peak source. In this case, the color difference is evident and results from the low Color Fidelity Index of that source, which is composed of only two spectral peaks, one blue and one yellow, making it impossible for this light source to reproduce any reddish or greenish tone. To evaluate the efficiency of both levels of color management more accurately, Table 4 shows the average color differences for each RGB channel. The RGB values measured from the screenshots are compared with those theoretically calculated from the spectral reflectance of each color patch of the ColorChecker under the different light sources. These color differences were calculated on each digital RGB channel, since RGB is the native value of the implemented color management system.
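The per-channel comparison can be sketched as follows; whether signed or absolute differences are averaged is an implementation choice, and the arrays of measured and predicted values are placeholders for the screenshot and calculated data.

```csharp
using System;

// Sketch of the per-channel comparison behind Table 4: the mean absolute
// difference between the RGB values measured on the screenshots and those
// predicted from the spectral reflectances, computed independently for the
// R, G, and B channels over the 24 ColorChecker patches.
static class ChannelDifference
{
    public static double[] MeanAbsoluteDifference(int[][] measured, int[][] predicted)
    {
        var sums = new double[3];
        for (int p = 0; p < measured.Length; p++)
            for (int c = 0; c < 3; c++)
                sums[c] += Math.Abs(measured[p][c] - predicted[p][c]);
        for (int c = 0; c < 3; c++) sums[c] /= measured.Length;
        return sums;   // average difference per channel (R, G, B)
    }
}
```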
In all cases, an improvement in color fidelity can be observed when the second level of color management is applied. However, the improvement is small, except for the light source composed of two spectral peaks. There is no absolute criterion for deciding when the first level of color management is sufficient and when both levels are necessary, since this depends on the spectral power distribution of the light source used and on its interaction with the spectral reflectance of the materials of the objects in the VR scene. This work points out that the Color Fidelity Index can be an indicator of when a light source may require the second level of color management, but setting a reference value for this index requires further research.