
New research is shedding light on how popular software used to power virtual reality (VR) can transform visual information, offering tools that help researchers run more precise and reliable experiments in immersive environments.
VR has become an important tool in vision science for studying how humans see and interact with the world. By working inside immersive VR environments, researchers can test colour perception, navigation and more in three-dimensional settings that more closely resemble everyday experience than traditional laboratory displays.
To create these virtual testing environments, scholars and scientists rely on game engines, software platforms originally developed for video games. An engine called Unity has become especially popular in vision research because of capabilities that allow scientists to build scenes with realistic lighting, situate objects at precise locations in three-dimensional space and respond in real time to a participant's movements.
For example, VR has been used to study colour constancy: why an object appears to stay the same colour even as lighting changes. Using Unity, scientists can place everyday objects into a virtual scene and alter the colour or intensity of the light while asking participants to judge or match what they see.
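The colour-constancy setup can be illustrated with a toy model. This sketch is not Murray's method or Unity's actual shading; it assumes a highly simplified rendering rule in which the light reaching the eye is the per-channel product of a surface's reflectance and the illuminant's colour, so changing the light changes the physical signal from an unchanged object.

```python
def rendered_rgb(reflectance, illuminant):
    """Per-channel product of surface reflectance and illuminant colour
    (a toy shading model, not Unity's actual rendering)."""
    return tuple(r * i for r, i in zip(reflectance, illuminant))

banana = (0.9, 0.8, 0.2)       # a yellowish surface (hypothetical values)
daylight = (1.0, 1.0, 1.0)     # neutral light
tungsten = (1.0, 0.8, 0.5)     # warmer, reddish light

under_daylight = rendered_rgb(banana, daylight)
under_tungsten = rendered_rgb(banana, tungsten)
# The physical signal differs between the two lights, yet observers tend
# to report the same object colour; that gap is what such studies probe.
```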

Many of Unity's features are automated and adaptive. As game players move through a virtual world, stepping from bright outdoor areas into dark interiors, the engine automatically adjusts brightness and colour so that details remain visible.
For researchers who need exact control over brightness and colour, that adaptability can be a problem.
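The kind of automatic adjustment described above can be sketched in a few lines. This is a generic auto-exposure model, not Unity's actual algorithm: displayed values are scaled so that the scene's geometric-mean luminance maps to a fixed target, which means the very same stimulus value reaches the screen differently depending on its surroundings.

```python
import math

def auto_expose(luminances, key=0.18):
    """Scale luminances so their geometric mean maps to `key`
    (a generic sketch, not Unity's actual exposure algorithm)."""
    log_mean = sum(math.log(v + 1e-6) for v in luminances) / len(luminances)
    exposure = key / math.exp(log_mean)
    return [min(1.0, v * exposure) for v in luminances]

# The same 0.2 patch embedded in a bright vs a dark surround:
bright_scene = [0.8] * 15 + [0.2]
dark_scene = [0.05] * 15 + [0.2]

patch_in_bright = auto_expose(bright_scene)[-1]
patch_in_dark = auto_expose(dark_scene)[-1]
# Identical requested values end up as different displayed values,
# which is exactly what a controlled experiment cannot tolerate.
```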
"If researchers and developers don't know precisely what images they're showing in VR, it makes their work less reliable and less replicable," says Richard Murray, a professor and director of the Murray Lab, which studies human visual perception.
If a scientist believes they are presenting a stimulus with a specific colour value, but the software transforms it in hidden ways before it reaches a screen or headset, then study results can be subtly distorted. Small differences in display behaviour can make it unclear whether an observed effect reflects human vision or the tools used to present the experiment.
Curious about how these hidden transformations might affect measurements of visual perception in VR, Murray set out to examine how Unity processes lighting, materials and colour before visual stimuli appear on a screen or VR headset.
In recently published work, Murray turned his attention to Unity's High Definition Render Pipeline (HDRP), the graphics system responsible for translating the numbers that define lighting, materials and colours inside the software into the images people see on a screen or headset.
Murray reverse-engineered HDRP's rendering process into a series of steps, carefully describing how numerical values for lighting and colour are transformed for screens. He then checked those predictions against Unity's actual output and real-world display measurements by generating thousands of virtual scenes with different lighting and colour settings to see whether Unity's output behaved as expected.
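That predict-then-measure approach can be shown schematically. The pipeline below is a toy stand-in for HDRP, assuming just two stages (exposure scaling followed by the standard sRGB transfer function) and omitting the real pipeline's tone mapping and colour grading; the point is that once each stage is an explicit function, its predicted output can be compared against what the engine and display actually produce.

```python
def srgb_encode(v):
    """Standard sRGB transfer function (linear value -> encoded value)."""
    if v <= 0.0031308:
        return 12.92 * v
    return 1.055 * v ** (1 / 2.4) - 0.055

def forward_model(linear_value, exposure=1.0):
    """Toy rendering model: exposure scaling, then sRGB encoding.
    A stand-in for HDRP's far richer pipeline, for illustration only."""
    return srgb_encode(min(1.0, linear_value * exposure))

# Predicted encoded values for a sweep of linear inputs; in a real test
# these would be compared against measured output from the engine/display.
inputs = [0.0, 0.25, 0.5, 1.0]
predicted = [forward_model(x) for x in inputs]
```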
Murray found that, under Unity's default settings, brightness and colour can behave in ways that are difficult to anticipate. Visual changes that seem straightforward in an experiment can be subtly reshaped by the software.
As a result, what participants see may not line up exactly with what a researcher intended.
This does not disqualify Unity as a research tool, however.
"From decades of work on human vision, researchers have developed a good understanding of how we need to be able to control images in order to run reliable experiments," says Murray. With that knowledge, he says, discrepancies can be addressed by treating the game engine like any other complex scientific instrument that requires careful calibration.
With appropriate configuration, Unity can be made to display luminance and colour with high precision, often close to the physical limits of the display. He stresses, though, that precision is an ongoing process: project leaders must measure what the system actually displays, apply corrections based on those measurements and repeat the process whenever hardware or software conditions change.
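The measure-correct-repeat cycle can be sketched as a simple gamma calibration. This is a generic procedure under an assumed power-law display model, not Murray's published tools: measure the luminance the display produces for known requested values, fit the exponent, then pre-distort future values with its inverse.

```python
import math

def fit_gamma(requested, measured):
    """Fit measured ~ requested**gamma by least squares in log-log space."""
    pts = [(math.log(r), math.log(m))
           for r, m in zip(requested, measured) if r > 0 and m > 0]
    return sum(x * y for x, y in pts) / sum(x * x for x, _ in pts)

def corrected(value, gamma):
    """Pre-distort a requested value so the displayed output is linear."""
    return value ** (1.0 / gamma)

# Hypothetical photometer readings from a display with gamma ~2.2:
requested = [0.1, 0.25, 0.5, 0.75, 1.0]
measured = [r ** 2.2 for r in requested]

g = fit_gamma(requested, measured)
# After correction, the display's output tracks the requested value:
linearized = corrected(0.5, g) ** g
```

In practice the `measured` list would come from a photometer pointed at the headset or screen, and the fit would be redone whenever hardware, drivers or engine settings change.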
"My goal with this project was to not only develop a mathematical model of Unity, but also software tools that would allow researchers to have greater control over experiments in VR," says Murray, who has shared open-source Unity projects and software tools with other researchers.
These tools show how to configure Unity for studies, how to run simple calibration checks inside an experiment and how to apply corrections based on real display measurements. They are designed to be reused and adapted, making careful calibration part of everyday VR research practice.
By making it possible to trust what participants actually see in VR, Murray's work helps ensure that insights drawn from studies in the medium rest on solid visual foundations. In doing so, it helps VR remain a dependable platform for studying how people see, perceive and interact with the world.
"I hope that it will enable researchers in a wide range of fields to make better, more informative and more reliable experiments," Murray says. "VR is a technology that will be around for a long time, including in research, so it will be important to get a better understanding of when it provides us with realistic visual environments and when it doesn't."
