Published August 28, 2022
| Version v1
Conference paper
PTVR: a user-friendly open-source script programming package to create Virtual Reality experiments
Contributors
Others:
- Laboratoire de psychologie cognitive (LPC); Aix Marseille Université (AMU)-Centre National de la Recherche Scientifique (CNRS)
- Biologically plausible Integrative mOdels of the Visual system: towards synergIstic Solutions for visually-Impaired people and artificial visiON (BIOVISION); Inria Sophia Antipolis - Méditerranée (CRISAM); Institut National de Recherche en Informatique et en Automatique (Inria)
- ANR-20-CE19-0018, DEVISE, From rehabilitation to visual aid systems for visually impaired ("low vision") people: innovative solutions integrated into a Virtual Reality environment (2020)
Description
Using Virtual Reality (VR) to investigate visual processing is a growing trend, due both to the high scientific potential of VR (allowing experiments in highly controlled environments and in ecological scenarios) and to the increasing power of ever cheaper VR headsets. However, implementing VR experiments requires non-trivial programming skills that are time-consuming to learn. Easing this implementation process is thus a great challenge, and should be guided by the success of existing script programming packages used to display stimuli on 2D monitors (e.g. the free, open-source package PsychoPy). A step in this direction was achieved by the "Perception Toolbox for Virtual Reality" (PTVR) package (first presented at ECVP 2018), with the ambition to follow the same Open Science philosophy as PsychoPy but applied to VR. At ECVP 2022, we propose a consolidated and extended version of PTVR with many new features. We will describe, from start to finish, how any researcher familiar with Python programming can create and analyze a sophisticated experiment in VR with parsimonious code. A 3D visual search experiment will serve to illustrate the ease with which: (1) 3D stimuli are positioned thanks to different coordinate systems, (2) online positions of the head, gaze, or remote controllers are used to point at the target interactively, and (3) all the data are accurately recorded across time. We will also present the resources allowing researchers to learn PTVR quickly, notably hands-on demos included with PTVR and a website (https://ptvr.inria.fr/) offering an extensive user manual.
Abstract
International audience
Additional details
Identifiers
- URL: https://hal.inria.fr/hal-03685492
- URN: urn:oai:HAL:hal-03685492v1
Origin repository
- UNICA