In recent years, virtual reality technology has become markedly more affordable, and an increasing number of applications offering immersive 360-degree visual content have emerged. To make this content accessible to people with low vision, one should adopt the same strategies as for traditional displays, i.e., use dedicated image enhancement methods to facilitate its interpretation. This work introduces a virtual reality application for mobile devices that implements real-time content enhancement. It is designed as a visual search task in a set of static 360-degree environments: the immersed user can manipulate the parameters of the enhancement algorithm in an intuitive way, using an external controller. In particular, we focus on the transform proposed by Peli et al. (IOVS, 1991), which is based on an adaptive filter that controls local contrast as a function of the local mean luminance of an image. This transform has been shown to improve recognition performance in patients with moderate visual loss, central scotoma, or cataracts. Our application is, to our knowledge, the first attempt to evaluate the impact of this image enhancement in an immersive virtual reality environment. In particular, our system allows real-time tuning of the transform and records all the quantitative data needed to analyse users' behaviour a posteriori, including how the parameters affect their performance. Designed as a game, it is perceived as more enjoyable than traditional ophthalmologic experiments. More generally, this application could offer people with low vision a way to adjust vision enhancements to their needs in everyday virtual reality applications, including those used for entertainment.
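To illustrate the general idea behind such an enhancement, the following is a minimal sketch of amplifying local contrast normalized by local mean luminance. It is not the exact algorithm of Peli et al. (1991): the box-filter neighbourhood, the `radius` and `gain` parameters (the kind of knobs a user might tune in real time), and the saturation step are our illustrative assumptions.

```python
import numpy as np

def local_mean(img, radius):
    """Box-filter local mean computed with a summed-area table (edge padding)."""
    pad = np.pad(img, radius, mode="edge")
    c = np.pad(pad.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    k = 2 * radius + 1
    h, w = img.shape
    return (c[k:k+h, k:k+w] - c[:h, k:k+w] - c[k:k+h, :w] + c[:h, :w]) / (k * k)

def enhance(img, radius=8, gain=2.0, eps=1e-6):
    """Boost local contrast relative to local mean luminance (illustrative only).

    `img` is a 2-D luminance array in [0, 1]; `radius` and `gain` are
    hypothetical tuning parameters, not those of Peli et al. (1991).
    """
    m = local_mean(img, radius)
    contrast = (img - m) / (m + eps)               # contrast normalized by local mean
    boosted = np.clip(gain * contrast, -1.0, 1.0)  # saturate to limit artifacts
    return np.clip(m * (1.0 + boosted), 0.0, 1.0)  # back to luminance, kept in [0, 1]
```

In a real-time setting, `radius` and `gain` would be the parameters exposed to the immersed user through the external controller; a uniform region passes through unchanged, while luminance edges are amplified.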