We replicated the novel transposed-word effect in grammaticality judgments described by Mirault et al. (2018), adding eye-tracking measures recorded in a virtual reality environment. We displayed five-word sequences that were either grammatically correct or ungrammatical, with two types of ungrammatical sequence: i) the Transposed-word condition, in which the ungrammaticality was created by transposing two adjacent words of a grammatically correct sentence (e.g., "The white was cat big"), and ii) the Control condition, matched to the Transposed-word condition but unable to yield a correct sentence through the transposition of two adjacent words (e.g., "The black ran dog fat"). An equal number of grammatically correct sentences (e.g., "The little brown rabbit") were intermixed with the ungrammatical sequences for the purposes of the grammaticality judgment task. We tested 40 participants, and the analysis of grammaticality judgments revealed a significantly higher error rate and longer response times in the Transposed-word condition than in the Control condition. The eye-tracking measures further revealed more fixations (refixations and regressions) and longer reading times in the Transposed-word condition than in the Control condition. We conclude, in accordance with Mirault et al. (2018), that the process of encoding word-order information during sentence reading is noisy. Beyond replicating and extending the Mirault et al. study, we demonstrate the feasibility of using eye-tracking measures in a virtual reality environment to investigate reading.