PiVR: virtual reality for small animals

The Raspberry Pi-based Virtual Reality system (PiVR) is a virtual reality platform for small animals. It was developed by David Tadres and Matthieu Louis (Louis Lab).

The tool has been published in PLOS Biology: “PiVR: An affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior”.

In addition, Scientific American featured PiVR in a delightful article accompanied by a fantastic video.

You can find a 1-hour presentation of PiVR hosted by World Wide Open Source on YouTube.

PiVR is used by a growing number of labs for their research. In addition, its affordability and ease of assembly make PiVR a great tool for teaching.

  • The code is open source (BSD license)

  • The source code for all the PiVR software can be found on GitLab

  • You can also find a bug tracker on GitLab

[Figure: larva_banana.jpg]

Leonard the larva - always chasing that virtual banana smell.

What can PiVR do?

PiVR has been used to create virtual odor realities for fruit fly larvae.

[Figure: larval_trajectory.png]

Trajectory of a Drosophila larva in a virtual odor reality. The larva expresses the optogenetic tool Chrimson in the Or42a-expressing olfactory sensory neuron.


PiVR has also been used to create virtual taste realities for adult fruit flies.

[Figure: fly_trajectory.png]

Trajectory of an adult Drosophila fly in a virtual taste reality. The fly expresses the optogenetic tool Chrimson in the Gr66a-expressing sensory neurons.


PiVR was also used to create a virtual light bulb for a number of animals, including larval zebrafish.

[Figure: fish_trajectory.png]

Trajectory of a zebrafish (D. rerio) larva exposed to a virtual white light source.


PiVR is also able to create dynamic virtual gradients

While it is often convenient to present static virtual gradients (see examples above), animals usually have to navigate environments that change over time. PiVR is able to present animals with dynamic virtual realities.

We presented Drosophila larvae expressing the optogenetic tool Chrimson in the Or42a olfactory sensory neuron with a dynamic odor plume based on the measurement of a real odor plume (Álvarez-Salvado et al.). PiVR thus enables researchers to study how Drosophila larvae orient themselves in more naturalistic environments.
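
Conceptually, the only change from a static virtual gradient is an extra time index: the virtual reality becomes a stack of gradient frames, and the stimulus lookup uses the frame that corresponds to the current moment of the experiment. The short Python sketch below illustrates that idea only; the array shape, the plume frame rate, and the stimulus_at helper are assumptions made for illustration, not PiVR's actual data layout or API.

    import numpy as np

    # Purely illustrative numbers: a dynamic virtual gradient stored as a stack
    # of 2D frames (time, y, x), e.g. resampled from measurements of a real
    # odor plume.
    plume = np.random.rand(100, 48, 64) * 100.0   # stimulus intensity in % LED power
    plume_rate = 10.0                             # plume frames per second (assumed)

    def stimulus_at(t_seconds, y, x):
        """Pick the plume frame corresponding to the elapsed experiment time,
        then index the animal's position - the same lookup as in the static
        case, just with an extra time dimension."""
        frame_index = min(int(t_seconds * plume_rate), plume.shape[0] - 1)
        return float(plume[frame_index, y, x])

    # Example: stimulus seen by an animal sitting at pixel (24, 32) after 4.3 s.
    print(stimulus_at(t_seconds=4.3, y=24, x=32))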


How does it work?

PiVR combines high-speed, low-latency tracking with temporally defined light stimuli to create arbitrary virtual realities.

PiVR is capable of tracking Drosophila larvae, adult flies, zebrafish and many other small animals.

The position of the animal in the real world is then mapped onto the user-provided virtual reality, and the appropriate stimulus is presented.
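
The sketch below illustrates this closed-loop idea in Python under simplifying assumptions: the animal is found by naive background subtraction, the virtual reality is a 2D array of LED intensities, and set_led_intensity is a hypothetical stand-in for the real hardware call. None of these function names are taken from the PiVR code base; this is a minimal conceptual sketch, not the actual tracker.

    import numpy as np

    def track_centroid(frame, background, threshold=30):
        """Locate the animal as the centroid of pixels that differ strongly
        from a static background image (simple background subtraction;
        PiVR's actual tracker is more elaborate)."""
        diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
        ys, xs = np.nonzero(diff > threshold)
        if xs.size == 0:
            return None  # animal not found in this frame
        return int(round(ys.mean())), int(round(xs.mean()))

    def set_led_intensity(intensity):
        """Hypothetical stand-in for the hardware call that drives the
        stimulus LED (on a real setup this would be a PWM command on the
        Raspberry Pi)."""
        print(f"LED intensity -> {intensity:.1f}%")

    def closed_loop_step(frame, background, virtual_gradient):
        """One cycle of the loop: find the animal, look up the stimulus
        value at its position in the virtual gradient, and apply it."""
        position = track_centroid(frame, background)
        if position is None:
            return
        y, x = position
        set_led_intensity(float(virtual_gradient[y, x]))

    # Toy example: a 480x640 virtual gradient that is brightest at the centre.
    h, w = 480, 640
    yy, xx = np.mgrid[0:h, 0:w]
    distance = np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
    gradient = 100.0 * (1.0 - distance / distance.max())  # 0-100% LED intensity

    background = np.zeros((h, w), dtype=np.uint8)
    frame = background.copy()
    frame[200:210, 300:310] = 255  # a bright "animal" blob
    closed_loop_step(frame, background, gradient)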


Sounds great. I want one! How?

PiVR has been designed to make building one as easy as possible, so that you spend less time assembling the setup and more time running experiments.

Please follow the Build your own PiVR guide for step-by-step instructions on how to build your own PiVR setup.

Don’t worry, it’s not hard and it won’t take too long. Please see the timelapse video below for an example of one setup being built: from 3D printing to running experiments!


I’ve got a setup. How do I use it?

If you are a first time user, check out the Step-By-Step Guide which will walk you through each of the four recording modes:

  1. Tracking single animal

  2. Virtual Reality experiments

  3. Image Sequence Recording

  4. Video Recording

You have just run an experiment. What to make of the output data? See here to understand what each output file means and what it contains.

To see how PiVR can help you analyse data, check out the tools available in the PC version of PiVR.

Advanced documentation

If you are running into trouble with the closed loop tracking, please head over to How to simulate real time tracking.

If you want to track an animal that is not available under Select Animal, please read the How to define a new animal chapter.

If you want to understand what each button in the GUI is doing, please see the PiVR Software Manual.

If you want to gain a high-level understanding of how the code identifies and tracks the animal, please read the Code Explanation.

The annotated source code can be found here and on the GitLab page.
